Once you have exported your files from Maya (or whatever package you use), you have an .fbx asset that you can use in Unity.
Most of the people I have followed to get to this point use Visible Meta Files. To be honest, I don’t know exactly why this is done yet; I know it has to do with programs that manage your assets for version control and the like. As I have yet to work on a team, this has not been a necessity for me, but I imagine it will eventually become important.
But even if you aren’t working on a team, you should know something about this method: the meta file stores all of the information that Unity uses for your asset. As far as I can tell, the .fbx file itself is more like a reference file, and the meta file is what Unity uses to determine how the asset behaves. So if you are copying an asset from project to project, the meta file needs to go with it. Dragging the two files into a new project will let Unity use the asset exactly as it was set up in the other project. I assume that if you didn’t have Visible Meta Files, the information would be stored in the asset itself (the .fbx file), but I don’t know that for sure at this point.
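For example, here is a minimal Python sketch of moving an asset between projects without losing its import settings. The paths and file names are hypothetical; the point is just that the .meta sidecar travels with the .fbx:

```python
import shutil
from pathlib import Path

def copy_asset_with_meta(asset: Path, dest_dir: Path) -> None:
    """Copy an .fbx asset AND its .meta sidecar so Unity keeps the import settings."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(asset, dest_dir / asset.name)
    # with Visible Meta Files the sidecar sits right next to the asset
    meta = asset.with_name(asset.name + ".meta")
    if meta.exists():
        shutil.copy2(meta, dest_dir / meta.name)

# hypothetical project layout:
# copy_asset_with_meta(Path("OldProject/Assets/Hero.fbx"), Path("NewProject/Assets"))
```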
About three years ago I decided to start making a 3D character for a platform game. I wanted to learn game design, and the way I would teach myself was to make the thing I most wanted to do: character design, rigging and animation for games. I now have a largely functioning character that does most of the stuff I set out to learn! It was, however, a very hard-won battle, full of setbacks and complications I couldn’t possibly have known about before running into them.
Hopefully you will find this as you begin rigging your character and you can benefit from the problems I ran into and avoid them yourself.
BTW, I am not saying this is the definitive way to do this; it’s just the method I developed that works for me. There are many approaches, so by all means, if you find a better one, go with it. (And let me know what it is!)
Last things first:
Exporting Your Character and Animations for Use in Unity:
I bake all of my animations to the skeleton and delete all other controls. I understand that Unity can take the Maya file with its controls, but I have my doubts about how well that works. So, just to take a bunch of unknowns out of the mix and keep things clean, I do the bake.
Keep each animation in its own FBX file. You could put all of the animations on the same model that you export into Unity, but that is a bad idea: you are going to find problems in your animation and model that you will need to go back and fix, and if your animations are divided into separate files, you only have to fix the ones that are broken.
Keep the model that you use in your game in T-pose with no animation on it. Again, you are going to have to update your character, and having it as a separate file that can be readily swapped out is better than having it loaded with a bunch of animations that may have to be recreated when you import the model again.
Skinning your character for use in Unity 3D:
At this time, Unity supports a maximum of 4 influence bones per vertex, which is plenty, but Maya allows as many as you like. Make sure that when you skin your mesh you limit the influences to 4 and turn on Maintain Max Influences. If you have more than that, Unity will optimize it on import and just decide for itself which weight should go to which bone.
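You should do this limiting in Maya’s skin weights tools, but the idea is easy to sketch in plain Python: keep the four heaviest influences per vertex and renormalize. This is illustrative only, not Unity’s actual import code:

```python
def prune_influences(weights, max_influences=4):
    """weights: dict of bone name -> skin weight for ONE vertex.
    Keep the heaviest `max_influences` bones and renormalize to sum to 1,
    roughly what an importer does when it finds too many influences."""
    kept = dict(sorted(weights.items(), key=lambda kv: kv[1], reverse=True)[:max_influences])
    total = sum(kept.values())
    return {bone: w / total for bone, w in kept.items()}

# a vertex with 5 influences: the lightest one gets dropped
v = {"spine": 0.4, "neck": 0.3, "head": 0.15, "jaw": 0.1, "eye": 0.05}
pruned = prune_influences(v)
```

If you let the importer do this for you, the dropped weight may not be the one you would have chosen, which is why it is better to paint with the 4-influence limit from the start.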
Animated platforms for your character to jump on present some weird factors that you have to take into consideration when you are building and animating the mesh. When creating the platform, put a null object above the mesh and keep the animation on the null. For some reason, animating the mesh itself creates strange results when your character is parented beneath it.
Something you should know right off the bat: sometimes Unity will quit reading a script correctly. This is really irritating, because if you are troubleshooting a script’s behavior and there is the added variable of Unity not processing the script correctly, you may waste time troubleshooting a script that actually works. So if you are certain that a script should be behaving correctly and it isn’t creating the desired behavior in your scene, just restart Unity. That is frequently the problem. After a while you will start to get a feel for when this is happening.
General considerations for animating characters for use in Unity:
First! Make sure you are animating at 30 fps and have the appropriate scale for Unity. Check this in Windows -> Settings/Preferences -> Preferences -> Settings:
Linear = Meter
Time = NTSC 30 fps
Important! Parallel evaluation mode can be problematic with the HIK rig and can cause crashes. Change the evaluation mode to DG (Dependency Graph).
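If you’d rather set these with a script, the Maya Python equivalents look roughly like this. This is a Maya-only fragment; it only runs inside Maya’s script editor:

```python
import maya.cmds as cmds

# working units: meters, NTSC 30 fps
cmds.currentUnit(linear='m', time='ntsc')

# fall back to the DG (Dependency Graph) evaluator to avoid HIK/Parallel crashes
cmds.evaluationManager(mode='off')
```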
As of this writing, animating in Maya 2016 with Viewport 2.0 causes weird ghost jerks in the animation that don’t really exist. Drop down to the Legacy viewport to make sure your animations are accurate.
I like to animate with Auto Key on, under Animation in the Preferences window. I like independent Euler-angle curves for both the new curve and HIK curve defaults.
I like weighted tangents to be on, and the default in and out tangents set to Auto.
Make sure that all of your joints have Segment Scale Compensate turned off. You can find this in the Attribute Editor. Trying to use a skeleton with Segment Scale Compensate on any joint will frequently crash Unity, and even when it doesn’t, the import will be useless.
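Rather than clicking through every joint in the Attribute Editor, a short Maya Python loop can turn the attribute off everywhere. Again, this only runs inside Maya:

```python
import maya.cmds as cmds

# disable Segment Scale Compensate on every joint in the scene
for joint in cmds.ls(type='joint'):
    cmds.setAttr(joint + '.segmentScaleCompensate', 0)
```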
When animating the Root Transform, also called the Reference, don’t rotate in the X or Z axis. You should orient your character in Y using this node, but if you rotate in X or Z, your character will come out of alignment, leaning and tumbling over a little more with every cycle of any animation that has X or Z values on the Reference joint.
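Here is a quick sanity check you could run over the root joint’s rotation keys before exporting. This is a pure-Python sketch; the key data layout is made up for illustration:

```python
def check_reference_rotation(keys, tolerance=1e-4):
    """keys: list of (frame, rx, ry, rz) tuples for the Reference/root joint.
    Returns the frames whose X or Z rotation is non-zero -- those keys will
    make the character lean a little further with every loop of the cycle."""
    return [frame for frame, rx, ry, rz in keys
            if abs(rx) > tolerance or abs(rz) > tolerance]

# Y rotation (facing) is fine; the stray X and Z keys are the problem
keys = [(1, 0.0, 90.0, 0.0), (12, 0.5, 90.0, 0.0), (24, 0.0, 90.0, -2.0)]
bad_frames = check_reference_rotation(keys)
```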
Here is a comparison of a high-poly model and its low-poly version that will be used in game. This character’s name is Copper, and you can play a prototype of her player mechanics now at www.joystick-game.com.
There are a lot of considerations when creating UV maps for your game assets.
There are some things that everybody knows:
- try to put UV seams in areas that are hidden.
- if you can’t hide a seam, put it on a crease.
- try to keep the UV layout uniform.
Here are some things you may not know when making UV maps for game objects:
- Try to have as few seams as possible, because the more UV shells you have, the more the system has to calculate.
- The best place for seams is along edges that also represent changes in material, for instance the transition from clothing to skin. This is really important and will influence not just your textures but the way you model, too. That is to say, you should add geometry where you have really stark transitions so that you can put a shell seam there.
- You need more fill texture space (padding) around your UV shells than you need for traditional rendering. Why? Because of mipmaps. Mipmaps are a sort of automatic level of detail for texture files in hardware renderers: the farther the camera gets from the object, the lower the resolution drops. This has the unintended effect of letting your texture decrease in resolution enough to misalign with your UV seams, which shows up as lines on your mesh wherever UV seams occur.
- This is sort of an extension of the mipmap phenomenon: you should cluster like textures together, so that if you have bleeding between UV shells, they blend with other similar textures. It also lets you put a big uniform texture fill behind those shells, so that if the mipmap gets too small, the bleed is hidden.
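To see why the bleed gets worse with distance, here is the mip chain for a 1024-square texture: each level halves the resolution, so by the time the renderer samples a low mip, your carefully placed padding covers less than a texel. Plain Python, just arithmetic:

```python
def mip_chain(size):
    """Resolutions of each mipmap level for a square power-of-two texture."""
    levels = []
    while size >= 1:
        levels.append(size)
        size //= 2
    return levels

print(mip_chain(1024))  # [1024, 512, 256, 128, 64, 32, 16, 8, 4, 2, 1]
```

A few pixels of padding at 1024 is gone entirely by the 64-pixel mip, which is why shells need generous, like-colored fill around them.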
By the way, mipmaps can be turned off if they are just giving you fits. It’s not really recommended, but you may want them off for your main character, or something that is going to remain in constant focus at the center of attention. To turn off mipmaps in Unity, select the texture file in your project and, in the import settings in the Inspector, set Texture Type -> Advanced; there is a toggle that disables mipmaps. You can also tweak them with different filtering calculations that may get the desired results, but if you are seeing lines, they probably won’t go away without either disabling mipmaps or tweaking your UV layout, your texture, or both.
Having Segment Scale Compensate on any joint in your chain will cause Unity to crash when you enter the rig configuration window. Before you export, make sure all of your joints have this turned off in the Attribute Editor.
New Features > Fully Developed Product… right? Does anybody agree with that? Does anybody think it’s a good idea to turn out the next model without working out all the bugs in its predecessor first?
NOOOOOOOOO! Nobody thinks that’s a good idea. Not with any product let alone a professional production product.
Autodesk is not alone in this. It’s a terrible trend that permeates the software development world and arguably all products. Get the next one out, get the next one out, get the next one out. Hey Autodesk, we as consumers of your product are not your beta testers and it is totally unacceptable to treat us like that.
I guess it’s designed to drive sales. I hope that software subscriptions will change this culture: rather than lurching from release to release, belligerently failing to address failures, I hope Autodesk will now be motivated not by the revenue of the next release but by the revenue generated from being a solid, well-developed product.