I have a transient thermal loading I need to apply. I can easily make a Data Surface for time or distance, but how do you go about making the load vary in both? There is no uniform scaling that can be applied, so a single function coupled with a data surface will not work to translate the loads in either dimension. I can make a new data surface for each set of loads, if that is really the only solution, but I figure/hope there is an easier way to do it.
In my limited Data Surface experience, if this is the only way to do it, it would be easiest to do a surface for each time step, to allow a smooth transition in temperatures down the length. Or would a definition of a temperature at specific 'X' locations over time be easy to implement? I just have a harder time visualizing how to do this one, though I expect the linear interpolation of the specified properties would still exist down the length, and it would not result in stepped or single point loads.
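For what it's worth, the behavior you're describing (linear interpolation between specified X locations, and a smooth transition between time steps) is just bilinear interpolation over the load matrix. This is a minimal sketch of the math only, in plain Python/NumPy with made-up example values; it doesn't touch Femap, where you would still need data surfaces to carry the spatial part:

```python
import numpy as np

# Hypothetical example data: X locations (the top row of the matrix)
# and time steps. Values are illustrative only.
x_locs = np.array([0.0, 10.0, 20.0, 30.0])   # distance down the length
times  = np.array([0.0, 1.0, 2.0])           # seconds
# temps[i, j] = temperature at times[i], x_locs[j]
temps = np.array([[ 70.0,  70.0,  70.0,  70.0],
                  [240.0, 180.0, 120.0,  80.0],
                  [200.0, 190.0, 150.0, 100.0]])

def temp_at(x, t):
    """Bilinear interpolation: linear in time between rows,
    then linear in X along the interpolated row."""
    row = np.array([np.interp(t, times, temps[:, j])
                    for j in range(len(x_locs))])
    return float(np.interp(x, x_locs, row))
```

For example, `temp_at(5.0, 1.0)` lands halfway between the first two columns of the t = 1 s row. This is effectively what a data-surface-per-time-step gives you inside Femap, with the solver interpolating between load sets in time.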
I was thinking another option would be to make a function of each set of loads, but I have not been able to sort out a way to apply the different functions to a specific time or location, except to create a whole new load definition for each one.
An example of the loads is here, with X location in the top row. My actual matrix is significantly larger.
I am assuming that your temperature results did not come from a Femap/Nastran transient thermal analysis? If they did, then Model | Load from Output is the fastest route, and a fairly simple macro (or otherwise an API) could repeat this command to create a new temp load set from each transient thermal time step.
The general philosophy with Nastran is that the time functions are numerous data points in time which multiply the base load value. I don't think the Nastran structure lends itself to double multipliers in the way you want - one for time and one for space - in a single load set. Femap can have a data surface and a time function, but the time multiplier is applied to one single spatial multiplier (one data surface), as you note.
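To illustrate why that restriction bites: one time function multiplying one data surface can only produce separable loads, T(x,t) = f(t)·g(x), which as a matrix is rank 1. A quick way to check whether your particular matrix happens to be separable (this is my own illustration in plain NumPy, not a Femap call):

```python
import numpy as np

# If temps[i, j] = f(t_i) * g(x_j), the matrix is an outer product
# and has rank 1 -> one time function + one data surface would suffice.
separable = np.outer([1.0, 2.0, 3.0], [70.0, 50.0, 20.0])   # rank 1 by construction
general   = np.array([[ 70.0,  70.0],
                      [240.0, 120.0]])                       # not separable

print(np.linalg.matrix_rank(separable))  # 1 -> single surface works
print(np.linalg.matrix_rank(general))    # 2 -> needs a surface per time step
```

If the rank comes out higher than 1 (which it sounds like it does in your case), no amount of rescaling one data surface will reproduce the matrix, and you are back to one surface per time step.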
Note that... IF (and I know it's a big if) you could get your spatial node numbering nicely ordered, you can easily end up with "x" time functions for "x" nodes (use same ID for each) and API these together or apply these directly via the text Nastran .dat... but if these temps are at locations where you need Femap to interpolate spatially, then you have many data surfaces and many load sets to create.
Thank you. I was fearing that I would have to make data surfaces for each load, and it seems like I have to. The thermal loads come from elsewhere, so there is no direct way to import them, and they are values to be interpolated between the given points. I was working on an API to load them, but was getting stuck and so I had one of the new engineers with a bit of time enter all the data.
The next issue is that I can't set up subcases in transient thermal, so it looks like I have to run each 1-second load as a separate analysis, starting from the end of the prior analysis. We were hoping to be able to chain them as a string of subcases so they would run without user input. Am I missing something, or is this particular case - a transient thermal load that must be defined with individual loads rather than a function - only possible as multiple independent analyses?
I am unclear whether you are running a transient thermal analysis (to get detailed transient temp results) or a thermal stress (mechanical) analysis to get deflections and stresses from the supplied temps over time. If you are doing a thermal stress analysis, you can run a multi-case non-linear analysis, where each case uses the respective temp load set with your respective spatial data surfaces. One mech case per second does sound like a lot though!
If it's purely thermal, then perhaps you can fudge the spatial interpolation by analysis! For example, you find the nearest node to your data point (this would all be best via API), and you apply the multiple time/temp data pairs to that node as a time function. Repeat for all the other data point locations. Choose a (sensibly) high thermal conductivity / low specific heat and run it as transient thermal. (Best if you first run the starting temps as steady state, and use that SS result as the initial condition for the transient analysis, via Model | Load from Output). Anyway, the transient results you get shouldn't be much different to a spatial interpolation of temps from your coordinate data points. It will only be "bad" if your coordinate points are nowhere close to a node in the model - which would probably be a limitation anyway if you used an Output Map Data Surface to achieve the same purpose. Check the results to be sure they are logical - ie. max transient temp obviously shouldn't be 400 if your max temp input was 240!
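The nearest-node lookup in that approach is trivial to script. A sketch in plain Python/NumPy - in practice the node IDs and coordinates would come from the Femap API, so the arrays here are illustrative stand-ins only:

```python
import numpy as np

# Hypothetical node data as it might be pulled from the model via the API
node_ids = np.array([101, 102, 103])
node_xyz = np.array([[ 0.0, 0.0, 0.0],
                     [10.0, 0.0, 0.0],
                     [20.5, 0.0, 0.0]])

def nearest_node(point):
    """Return the node ID closest to a data-point location."""
    d = np.linalg.norm(node_xyz - np.asarray(point, dtype=float), axis=1)
    return int(node_ids[np.argmin(d)])

# Each data-point column of the temp matrix then becomes one time/temp
# function applied at its nearest node.
print(nearest_node([9.0, 0.0, 0.0]))
```

From there the loop is: for each X location, find the nearest node, build a time function from that column of the temperature matrix, and apply it as a nodal temperature load.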
Now you have a full collection of time/temp results for all the nodes in the model, which can then be used for thermal stress if that's the ultimate goal. Remember to put your specific heat and conductivity back to the real values if you intend to do some other thermal analysis (eg. transient free cooling) from any of your time step positions.