Hi all,
I recently noticed that with the compound part function the stresses in the parts are wrong.
I set up an example of a glued connection. Here I have two steel parts which are connected with glue:
I modeled two different connection methods: one with a compound part and the other one with the tie command.
The resulting peak von Mises stress in the glue should be around 5 MPa.
This is true for the tied connection:
My guess now is that at the transition from steel to glue the compound part creates elements which share integration points at the junction. Maybe the software then doesn't know which material stiffness value to use for those integration points.
Actually, I wouldn't call this a bug, since the compound part function seems to fulfill its purpose very well. The stress concentration that you get here is likely just a limitation of the modeling approach itself. By the way, the tie constraint may also cause unexpected stress concentrations in some cases. Here's an example:
The reason in this case is probably the built-in averaging of nodal values in CalculiX. If you are using compound parts, the meshes of the different material regions are merged, but CalculiX does not account for different materials while averaging; it averages over all elements attached to a node. On the other hand, if you are using tied connections, the meshes are separate and so is the averaging.
I am not aware of an option to tell CalculiX to average the nodal values based on the material (this information is available in the .frd result files, so it might be an option).
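To illustrate the effect, here is a small Python sketch with made-up element contributions at a single interface node (the numbers and set names are purely illustrative):

```python
# Made-up element nodal values (in MPa) contributed to one shared interface node.
contributions = {
    "steel": [118.0, 122.0],  # from the steel elements touching the node
    "glue":  [4.8, 5.2],      # from the glue elements touching the node
}

# Default behaviour on a merged (compound) mesh: average over all attached elements.
all_values = [v for vals in contributions.values() for v in vals]
merged_average = sum(all_values) / len(all_values)

# What split parts + tie effectively give: separate nodes, so separate averaging.
per_material = {mat: sum(vals) / len(vals) for mat, vals in contributions.items()}

print(f"merged-mesh nodal value:   {merged_average:.1f} MPa")  # ~62.5 MPa, belongs to neither material
print(f"per-material nodal values: {per_material}")            # steel ~120 MPa, glue ~5 MPa
```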
There one will find that, for a fully modeled sandwich, an MPC was created between the core and the facesheets to avoid stress averaging (similar to a tie constraint).
In the CalculiX help for *EL FILE it is written:
"Due to the averaging process jumps at material interfaces are smeared out unless you model the materials on both sides of the interface independently and connect the coinciding nodes with MPC’s."
So, again, the creation of MPCs is recommended.
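For reference, this is roughly what generating such MPCs could look like: a minimal Python sketch (the node numbers are made up) that writes *EQUATION cards tying each pair of coinciding nodes in all three DOFs, so the two material regions keep separate nodes for averaging but move together:

```python
# Sketch: write *EQUATION cards u_dep - u_indep = 0 for pairs of coinciding nodes.
# The first node of each equation is the dependent one and gets eliminated,
# so it must not be used in any other SPC/MPC.
def write_coincident_mpcs(node_pairs, path="interface_mpc.inp"):
    with open(path, "w") as f:
        f.write("*EQUATION\n")
        for dependent, independent in node_pairs:
            for dof in (1, 2, 3):            # ux, uy, uz
                f.write("2\n")               # number of terms in this equation
                f.write(f"{dependent}, {dof}, 1.0, {independent}, {dof}, -1.0\n")

# Hypothetical coinciding pairs along the glue line (glue node, steel node):
write_coincident_mpcs([(201, 101), (202, 102), (203, 103)])
```

The resulting file can then be pulled into the input deck with *INCLUDE.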
I used this compound part function without knowing about this bug. My workaround is now the tie constraint.
What I really liked about the compound part was that you could define a fine mesh for the glue, and the surrounding mesh was fine around the glue area and coarse everywhere else. Now I have to split the parts at the glue area to achieve the same result.
But the biggest problem of all is probably users who don't know about this issue and use it for stress analysis.
Do you think an automatic tie constraint could be implemented at the transition between two surfaces of different parts?
This would also help when parts are not 100% aligned with each other and the compound part function is used.
For now, the compound part feature creates a merged mesh by default. I do not think it is a bug. If you use the Split compound mesh setting in the mesh parameters, you will get the same mesh, only the part meshes will be split. Then you can use Tie to connect them together.
Is there a difference between using MPCs or a Tied constraint on a mesh with coincident nodes?
Okay, I agree, the split compound part function offers a great opportunity to create a tie constraint afterwards. This is for me the best of both worlds, because then my nodes are aligned directly on each other. Maybe my understanding of the compound part is wrong; perhaps one should only merge parts with the same material.
I think that there is no “correct” stress solution, as the stresses are singular at the end of the adhesive strip. It seems that the Al sides of the tie joints are the slaves and the glue sides are the masters. Due to the z constraint on the Al side, which has a row of nodes in common with the tie surface, this row of nodes is not active in the tie constraint and you get this pattern:
On the other side this doesn't happen, as the z constraint is applied to the outer side of the Al bar.
Yet, if you try to refine the mesh on the other end, you still get strange patterns, because the mesh on the slave side is now coarse while the one on the master side is fine. The slave side should be the one with the finer mesh.
Actually, it is not easy to see which side is slave and which is master if both surfaces are congruent.
One would need a visual cue as to which part the surface belongs to, e.g. by making all other parts transparent or something like that when you highlight the entry in the tie dialog.
As it is now, you have to “show only” the part in order to actually see the surface.
Also, a “Swap master/slave” button would be helpful, because doing this now is quite cumbersome and involves multiple visibility changes.
As to the aspect of nodal averaging:
The script separate.py on my example page completely suppresses nodal averaging in the model. This is done by giving every element its own global nodes and writing equations to tie the nodes. The procedure is applied to an existing input file (mesh) and doesn't produce MPC conflicts, because the equations only eliminate the added nodes. It should be possible to restrict this procedure to element sets instead of individual elements. Yet, in contrast to other FEA programs, where you can select nodal averaging or non-averaging in postprocessing, this would be a decision at the model data level.
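To give an idea of the procedure (a toy illustration only, not the actual separate.py), the core step is to make sure that no two elements share a node: the first element to use a node keeps it, every later use gets a fresh copy that is tied back to the original with an equation. The added nodes would get the coordinates of their originals, and the returned pairs would become *EQUATION cards as above:

```python
# Toy sketch: decouple elements at shared nodes and collect (added, original) tie pairs.
def separate_elements(elements, start_node_id):
    """elements: {elem_id: [node ids]}; returns new connectivity and tie pairs."""
    next_id = start_node_id
    new_elements, tie_pairs, seen = {}, [], set()
    for eid, nodes in elements.items():
        new_nodes = []
        for n in nodes:
            if n not in seen:
                seen.add(n)
                new_nodes.append(n)          # first use keeps the original node
            else:
                new_nodes.append(next_id)    # every later use gets a duplicate
                tie_pairs.append((next_id, n))
                next_id += 1
        new_elements[eid] = new_nodes
    return new_elements, tie_pairs

# Two quad elements sharing nodes 2 and 3 (made-up connectivity):
elems = {1: [1, 2, 3, 4], 2: [2, 5, 6, 3]}
new_elems, pairs = separate_elements(elems, start_node_id=100)
print(new_elems)  # {1: [1, 2, 3, 4], 2: [100, 5, 6, 101]}
print(pairs)      # [(100, 2), (101, 3)]
```

Restricting the outer loop to the elements of a set would limit the duplication (and the extra equations) to that set.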
With a tie constraint, the dependent and independent sides (slave/master) have to be uniform over the whole interface, whereas with MPCs (equations) you can select that pointwise (e.g. to avoid constraint conflicts).
I can't run the pmx file given in the topic starter's post (the first one) in version 1.3. The pmx file was made in 1.1.1. It can be opened in 1.3, but you have to correct the z constraint manually (there might be more to correct, but that is what I found). Yet when running the simulation, I get this error message:
In order to better distinguish and see master and slave surfaces, there is an Exploded view option.
The swap master/slave option is available on the context menu (right-click) on the model tree.
I will check your separate.py solution and see if I can integrate something like that into CalculiX. The other possibility would be to save the results in the .dat file and do the post-processing later. My only concern is that such .dat files would be huge.
Thanks for the hint. Exploded view does the trick. A button to swap master/slave in the edit dialog would be nice to have.
Postprocessing from the .dat file would indeed allow for flexibility in nodal averaging. You could also switch between extrapolation and simple transfer of integration point results to the nodes.
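As a rough sketch of what such postprocessing could look like, here is a Python snippet that reads integration point stresses from a .dat file and computes von Mises values. It assumes the usual *EL PRINT stress table layout of element number, integration point number and six stress components per line; this should be checked against the actual .dat file before relying on it:

```python
import math

def von_mises(sxx, syy, szz, sxy, sxz, syz):
    return math.sqrt(0.5 * ((sxx - syy)**2 + (syy - szz)**2 + (szz - sxx)**2)
                     + 3.0 * (sxy**2 + sxz**2 + syz**2))

def read_ip_stresses(path):
    """Return {(element, integration point): von Mises stress}."""
    results = {}
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) != 8:
                continue                      # skip headers and other output blocks
            try:
                elem, ip = int(fields[0]), int(fields[1])
                s = [float(x) for x in fields[2:]]
            except ValueError:
                continue
            results[(elem, ip)] = von_mises(*s)
    return results

# A simple non-extrapolated reduction: average the integration point values per
# element (assigning this value to the element's nodes would be one form of
# "simple transfer" without extrapolation).
ip_vm = read_ip_stresses("job.dat")
elem_vm = {}
for (elem, _), vm in ip_vm.items():
    elem_vm.setdefault(elem, []).append(vm)
elem_vm = {e: sum(v) / len(v) for e, v in elem_vm.items()}
```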
It would also allow displaying integration point results as coloured dots at the correct locations, much like this is possible in Z88. That would be interesting for teaching and probably only useful for small models, so there wouldn't be too much concern about file size in this case.
separate.py isn't free either in terms of file size. Yet, when restricted to set boundaries, only a small portion of the nodes has to be duplicated.
Yes, I was happy that the model from 1.1.1 could be inspected in 1.3 at all. So my hope was that only a small piece was missing for complete compatibility.
I would also like a button to swap master/slave, but it is a problem to add such a button to the property control. So I tried using right-click to do it for the Tie constraint, but I do not like it too much, so I did not implement it in the contact form.
Reading the .dat file has another drawback: it would require a lot of work to add the required interpolation functions for all element types. And then it would require plotting the results differently. Now, there is only one value per node that is plotted, so the node can be plotted only once. For a representation of the results without averaging, the node has to be plotted multiple times, once for each value. And that would require additional development work.
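Roughly, the data layout for such non-averaged display would have to look something like this small sketch (made-up values), where a shared node is emitted once per attached element so that each copy can carry its own value:

```python
# Made-up 2D example: two quad elements sharing nodes 2 and 3.
node_coords = {1: (0, 0), 2: (1, 0), 3: (1, 1), 4: (0, 1), 5: (2, 0), 6: (2, 1)}
elements = {1: [1, 2, 3, 4], 2: [2, 5, 6, 3]}
elem_nodal_values = {1: [10.0, 118.0, 122.0, 12.0],   # element-wise nodal values,
                     2: [4.8, 6.0, 5.5, 5.2]}         # not averaged across elements

vertices, values = [], []
for eid, nodes in elements.items():
    for n, v in zip(nodes, elem_nodal_values[eid]):
        vertices.append(node_coords[n])   # nodes 2 and 3 are emitted twice,
        values.append(v)                  # once per element, each with its own value
```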
Plotting integration points is a nice idea. It is good for 2D cases and also for small 3D cases. This would require adding functions that compute the positions of the integration points for all element types. For now, such functionality is not implemented inside PrePoMax.
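As an example of what this involves, here is a sketch for a single element type only: the integration point positions of an 8-node hexahedron (C3D8) with full 2x2x2 Gauss integration, obtained by evaluating the trilinear shape functions at the Gauss points. Every other element type would need its own shape functions and integration scheme, and the point ordering below is not necessarily CalculiX's internal one:

```python
from itertools import product

GAUSS_1D = (-1.0 / 3**0.5, 1.0 / 3**0.5)   # 2-point Gauss rule in natural coordinates

# Natural (xi, eta, zeta) coordinates of the 8 corner nodes, standard hex ordering
CORNERS = [(-1, -1, -1), (1, -1, -1), (1, 1, -1), (-1, 1, -1),
           (-1, -1,  1), (1, -1,  1), (1, 1,  1), (-1, 1,  1)]

def hex8_integration_points(node_xyz):
    """node_xyz: 8 (x, y, z) corner coordinates; returns 8 global Gauss point positions."""
    points = []
    for zeta, eta, xi in product(GAUSS_1D, repeat=3):
        # Trilinear shape functions evaluated at this Gauss point
        N = [0.125 * (1 + xi * a) * (1 + eta * b) * (1 + zeta * c)
             for a, b, c in CORNERS]
        points.append(tuple(sum(Ni * xyz[k] for Ni, xyz in zip(N, node_xyz))
                            for k in range(3)))
    return points

# Unit cube: the Gauss points end up at 0.5 +/- 0.2887 in each direction
cube = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
        (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
print(hex8_integration_points(cube))
```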