With continued research on brain-machine interfaces (BMIs), it is now possible to control prosthetic arm position in space with high accuracy. However, a reliable decoder that infers the dexterous movements of fingers from brain activity during a natural grasping motion has yet to be demonstrated. Here, we present a methodology to accurately predict and reconstruct natural hand kinematics from non-invasively recorded scalp electroencephalographic (EEG) signals during object grasping movements. The high performance of our decoder is attributed to a combination of the correct input space (time-domain amplitude modulation of delta-band smoothed EEG signals) and an optimal subset of EEG electrodes selected using a genetic algorithm. Joint-angle trajectories were reconstructed for the metacarpo-phalangeal (MCP) joints of the fingers as well as the carpo-metacarpal (CMC) and MCP joints of the thumb. The high decoding accuracy between the predicted and observed trajectories (Pearson's correlation coefficient r = 0.76 ± 0.01, averaged across joints) indicates that this technique may be suitable for use in a closed-loop real-time BMI to control grasping motion in prosthetics with high degrees of freedom. This demonstrates the first successful decoding of hand pre-shaping kinematics from non-invasive neural signals.
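The pipeline outlined above (delta-band smoothing of scalp EEG, a decoder mapping those time-domain amplitudes to joint angles, and evaluation by Pearson's r between predicted and observed trajectories) can be sketched in broad strokes. This is a minimal illustration on synthetic data, not the authors' implementation: the sampling rate, band edges, filter order, and the use of an ordinary least-squares linear decoder are all assumptions, and the genetic-algorithm electrode selection step is omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)

# Assumed dimensions: these are placeholders, not values from the paper
fs = 100.0                     # EEG sampling rate in Hz (assumed)
n_samples, n_channels, n_joints = 2000, 8, 3

# Synthetic stand-ins for recorded EEG and joint-angle trajectories
eeg = rng.standard_normal((n_samples, n_channels))
joints = rng.standard_normal((n_samples, n_joints))

# Delta-band smoothing: zero-phase band-pass filter (0.1-1 Hz assumed edges)
b, a = butter(4, [0.1, 1.0], btype="bandpass", fs=fs)
features = filtfilt(b, a, eeg, axis=0)

# Simple linear decoder: least-squares map from smoothed EEG to joint angles
X = np.hstack([features, np.ones((n_samples, 1))])   # append bias column
W, *_ = np.linalg.lstsq(X, joints, rcond=None)
pred = X @ W

# Decoding accuracy: Pearson's r per joint, predicted vs. observed
r = np.array([np.corrcoef(pred[:, j], joints[:, j])[0, 1]
              for j in range(n_joints)])
print("per-joint r:", r, "mean:", r.mean())
```

In a real closed-loop BMI the decoder would be trained on calibration trials and applied causally to streaming EEG; the zero-phase `filtfilt` used here for clarity would be replaced by a causal filter.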