Digital "grain" doesn't exist. Grain in film is literally the crystals in the emulsion becoming visible: there's only so much fine detail you can record in crystals, and getting more light sensitivity required much larger crystals.
Noise is what digital systems get, and it's noise in the signal-processing sense (signal = information, noise = randomness). This distinction matters because it's entirely possible to get digital images that are almost entirely free of noise. You simply have to stay within the dynamic range of the sensor. Most typically, that means shooting at the native ISO with a cool sensor. What introduces noise into the image is increasing ISO, which, from an electronics point of view, means adding gain via amplification circuits. The amplifier can't tell signal from noise, so the more gain you add, the more you boost the noise floor along with the image. Think of it like turning a radio up way too loud and how the sound gets distorted and fuzzy.
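A quick numerical sketch of that point, using numpy (the photon counts, read-noise level, and gain factor here are made-up illustrative values, not real sensor specs): gain multiplies signal and noise alike, so SNR doesn't improve, whereas actually collecting more light does.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(x):
    """Signal-to-noise ratio: mean over standard deviation."""
    return x.mean() / x.std()

# Dim scene: ~50 photons per pixel (Poisson shot noise),
# plus Gaussian read noise from the electronics.
dim = rng.poisson(50, 100_000).astype(float) + rng.normal(0, 5, 100_000)

# "Raising ISO": the amplifier multiplies signal AND noise by the gain.
boosted = 8.0 * dim

# Actually collecting 8x more light instead (longer exposure / wider aperture).
bright = rng.poisson(400, 100_000).astype(float) + rng.normal(0, 5, 100_000)

print(f"dim SNR:     {snr(dim):.2f}")      # gain can't beat this...
print(f"boosted SNR: {snr(boosted):.2f}")  # ...identical: gain adds no information
print(f"bright SNR:  {snr(bright):.2f}")   # more photons genuinely helps
```

The boosted image looks brighter but is exactly as noisy, relative to its signal, as the dim one; only gathering more light improves the ratio.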
Anyway, when digitizing film, there's a bunch of stuff you can do to eliminate basically all digital noise: taking panos of the film you're digitizing and stitching them together, ensuring good illumination, ETTR exposure, and exposure stacking.
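Exposure stacking in particular is easy to demonstrate. A minimal sketch with numpy, assuming a static target (the film doesn't move between shots) and simple Gaussian read noise: averaging N frames cuts the random noise by roughly a factor of sqrt(N), while the signal stays put.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend this is the true, noise-free scan of a piece of film.
truth = np.full((64, 64), 100.0)

# Capture 16 frames of the same target, each with independent sensor noise
# (sigma = 10 in arbitrary units; the values are illustrative).
frames = [truth + rng.normal(0, 10, truth.shape) for _ in range(16)]

# Stack by averaging: signal is identical in every frame, noise is random,
# so the noise partially cancels out.
stacked = np.mean(frames, axis=0)

single_err = np.std(frames[0] - truth)   # ~10, one frame's noise
stacked_err = np.std(stacked - truth)    # ~10 / sqrt(16) = ~2.5

print(f"single-frame noise: {single_err:.2f}")
print(f"16-frame stack:     {stacked_err:.2f}")
```

Same idea scales: 64 frames gets you an 8x noise reduction, which is why stacking plus ETTR can push a scan's noise below anything visible against the film's own grain.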