I have been focused on VB6/ASP/COM to .NET migration pretty much exclusively since 2005 -- in fact, I design, develop, and sell migration tools and deliver migration projects for a living. Our migration tools can delete redundant code and restructure code, but they do NOT work automagically -- they are software, and they are developed by programmers (in our case, the lead developer is a mathematician and linguist who has been writing compilers and translators since 1974). Our VB6/ASP/COM translator is actually a compiler hooked up to a state-of-the-art decompiler, with a high-performance information management system in the middle. The information management system is screaming fast and implements many algorithms that analyze and transform the compiled VB6/ASP/COM into a form that is faithful to the original semantics of the source and also compatible with being re-authored as .NET. The decompiler (aka the Author) is programmable, so users can generate code that fits their own standards.
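In outline, a translator of that shape is a three-stage pipeline: compile the source to an intermediate representation, analyze and transform that IR, then re-author it in the target language. Here is a minimal sketch of that flow; every name and the IR structure below are my own placeholders, not the API of any real tool:

```python
# Hypothetical sketch of a compile -> analyze/transform -> re-author pipeline.
# None of these names come from a real product; they only illustrate the shape.

def compile_to_ir(vb6_source: str) -> dict:
    """Stand-in for the compiler front end: parse VB6 into an IR."""
    return {"kind": "module", "source": vb6_source, "target": None}

def analyze_and_transform(ir: dict) -> dict:
    """Stand-in for the information management system: semantics-preserving
    transforms that make the IR compatible with .NET re-authoring."""
    ir = dict(ir)
    ir["target"] = "dotnet"
    return ir

def author_target(ir: dict) -> str:
    """Stand-in for the programmable decompiler (the 'Author')."""
    return f"// re-authored for {ir['target']}\n// from: {ir['source']}"

def translate(vb6_source: str) -> str:
    return author_target(analyze_and_transform(compile_to_ir(vb6_source)))
```

The point of the middle stage is that all the hard analysis happens on a structured representation, not on source text, which is what makes rule-driven restructuring practical.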
The most important training that I see for a migration team is training themselves so they know what they are getting into, so they can make intelligent choices about how they want to code their app in .NET, and so they know how to maintain and enhance their system after the migration. Tools cannot learn for you; this is something you have to do for yourself.
The standards point you brought up is also critical -- most organizations cannot even agree on standards internally, so it is folly to expect the "standard" code produced by a tool right out of the box to be right for everyone. Different applications serve different needs, and this drives different standards. These different standards need to be specified in a form the translator can use -- that is like the "training the process" you mentioned above; we actually call it tuning the translation process. Of course, this means a translation tool has to be flexible and support sophisticated, user-defined migration rules.
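As an illustration only (the rule format here is invented, not any real tool's syntax), "tuning" can be thought of as a user-edited rule table that the translator consults, with team-specific overrides layered on top of stock defaults:

```python
# Hypothetical rule table showing how a team might tune type mappings.
# The default mapping ships with the tool; the overrides express local standards.

DEFAULT_RULES = {
    "Variant": "object",
    "Collection": "System.Collections.ArrayList",
}

TEAM_OVERRIDES = {
    # This team's standard prefers generics over legacy collections.
    "Collection": "System.Collections.Generic.List<object>",
}

def resolve_type(vb6_type: str) -> str:
    """Apply team overrides first, then the stock rules; pass through unknowns."""
    merged = {**DEFAULT_RULES, **TEAM_OVERRIDES}
    return merged.get(vb6_type, vb6_type)
```

With this shape, changing a standard means editing one rule and retranslating, rather than hand-editing every occurrence in the generated code.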
When people tell me they just have to rewrite a huge legacy application, I like to present this analogy: let's say you had to do a massive data conversion: millions of records of data (some of it a little 'dirty' or maybe very 'dirty') in hundreds of tables being restructured and laid out into hundreds of different tables according to a new schema. What would you think if some consultant told you, "just freeze the data for a few months and pay us to re-enter it"? You would say no thanks. You would say use tools! But you would NOT just take the output of the first version of the conversion process and blow it into production, right? You would test, tune, refine, and test again until you were certain the process produced data that was complete and correct according to the new schema and business rules.
IMO, the sensible way to migrate very large VB6/COM apps to .NET is similar to the large data conversion. You need to test, tune, and refine the conversion process before you "cut over" to the new code. With our tools, this "tuning" is done primarily by creating migration scripts and other refactoring rules rather than by modifying the VB6/ASP code. We generate, inspect, play with, rework, and learn from intermediate versions of the translations, then put what we learn back into the tool configuration and retranslate it all again, and again, until we get .NET code that we like -- code we are confident we can finish to production and take forward after the migration. Clearly, original redesign/rewriting work is part of this process, even though you are using tools to help you reimplement your application.
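The test/tune/retranslate cycle above can be sketched as a loop. This is purely illustrative; in practice the "review" step is humans inspecting the generated .NET code, and the "config" is the accumulated set of migration rules:

```python
# Illustrative tune-and-retranslate loop; all functions are placeholders.

def translate(source: str, config: dict) -> str:
    """Placeholder translator: output quality grows with the rule set."""
    return f"{source} translated with {len(config)} rules"

def review(output: str) -> list:
    """Return issues found; empty means the team is happy with the code.
    (Here, a toy check standing in for human code review.)"""
    return [] if "3 rules" in output else ["needs another rule"]

def migrate(source: str) -> str:
    config = {}
    while True:
        output = translate(source, config)
        issues = review(output)
        if not issues:
            return output  # cut over only once the generated code is right
        # Fold what the review taught us back into the tool configuration,
        # then retranslate everything from the original source.
        config[f"rule{len(config)}"] = issues[0]
```

The key property is that every iteration restarts from the untouched original source, so no hand-edits to intermediate output are ever lost.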
There is more info relevant to this topic at
http://www.greatmigrations.com