Imprinting your brain onto circuitry in order to fully replicate your current state of consciousness, memories, personality, and thought processes. In other words, YOU are the AI. I'm personally looking forward to having conversations with myself. Logically, the moment my brain is copied becomes a splitting point, and after that point the copy is a separate person with separate experiences. It would be a weird feeling, though. Would I still feel like me, or would I grow distant from myself, each copy regarding the others as less important? Could I handle talking to someone who thinks exactly the way I do but has unlimited access to the internet, or would I be driven to suicide by a feeling of inadequacy next to this new, more knowledgeable self?
Would I need entertainment still? Would games hold any draw for me if I am literally inside them?
On a similar note, would a "download" be available as well as an "upload"? Could I imprint myself onto circuitry, temporarily put my human self on "standby" while I play a game from a completely immersive point of view, then download the experience into my human brain and feel as if I'd actually been there? Would I forgo the return to the physical world entirely and decide to live solely in the cyber universe? Would a computer virus be able to corrupt my brain patterns? If so, I'd hope I'd be smart enough to make regular backups before hopping into the cyber-me, automatically detecting corrupt data and restoring the last good backup.
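The backup scheme described above (regular snapshots, automatic corruption detection, roll back to the last good copy) is already a standard software pattern. A minimal sketch, with an entirely hypothetical `BackupStore` class standing in for whatever would actually hold a mind-state, might look like this:

```python
import hashlib
import time

class BackupStore:
    """Keeps timestamped snapshots with checksums; restores the newest intact one."""

    def __init__(self):
        self.snapshots = []  # list of (timestamp, sha256 hex digest, data)

    def backup(self, data: bytes):
        # Record a checksum alongside the data so corruption is detectable later.
        digest = hashlib.sha256(data).hexdigest()
        self.snapshots.append((time.time(), digest, data))

    def restore_last_good(self) -> bytes:
        # Walk backwards in time, skipping any snapshot whose data
        # no longer matches the checksum taken at backup time.
        for ts, digest, data in reversed(self.snapshots):
            if hashlib.sha256(data).hexdigest() == digest:
                return data
        raise RuntimeError("no uncorrupted backup found")
```

The key design choice is that the checksum is computed at backup time and verified at restore time, so a virus (or bit rot) that alters a snapshot after the fact is caught automatically, and the restore silently falls back to the most recent snapshot that still verifies.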
This line of thinking deserves its own thread, honestly. There's so much to discuss on the possibilities; let alone the hows, whys, and wherefores.
OMG the porn implications!
In all seriousness though, it seems like not enough details are given about B, C, and D to say what will come first.
My gut feeling is that B will come first, since an already-developed brain needs just five things: glucose; electrolytes; oxygen; a sterile, biologically familiar environment; and a way to dump the waste products of cellular respiration. We could probably do this today if we threw enough money at it and tossed the ethics book in the trash.
If they mean "avatar" as in something capable of normal human activities, then that's slightly more difficult. Because the first "avatars" are likely to be little more than glorified, mobile life-support systems, you would not want your brain in the first avatar.
In physiology, how our bodies deal with (and demand) all the junk we put into them depends on pharmacokinetics and on the actions of many different enzymes we've only begun to understand. For practical reasons, we'd probably toss out most of the complex metabolic pathways for breaking down alcohols, proteins, fats, and drugs, and the avatar would only need to consume simple sugars to survive. Since the avatar would probably lack a traditional stomach, most of the sensation of hunger would be gone, though there would likely still be some lingering desire for comfort food fueled by past memories. Much the same goes for any other pleasure a regular human enjoys.
Sure, you could accommodate these things (give the avatar feedback-enabled sensory organs to actually fulfill those desires, plus a container to store the non-fuel food it "eats"), but that's a lot of R&D bucks to scratch an itch.
If the avatar is supposed to be self-maintaining/repairing, that's another (probably bigger) can of worms.
On the plus side, you could probably beat anyone in a drinking contest, since your avatar body wouldn't metabolize the alcohol at all; unless ethanol is actually your fuel (the one powering your artificial muscles, that is, not the one keeping your brain going, of course).