PSD generation
Hi,
A couple of questions on particle generation by PSD:
1/ What is the difference between makeCloud() with psdSizes/psdCumm, particleSD() and particleSD2()?
2/ If we want a mixture with, let's say, x% particles in [r1,r2] and (100-x)% particles in [r3,r4], with no particles between r2 and r3 (staircase psdCumm), will one of the above apply?
3/ Is there a precise reason why the "best" psdScaleExponent is not 3?
4/ It seems particle generation starts with the smaller particles and finishes with the bigger ones; this is very demanding for the positioning algorithm, which tries to find free spots. It would be easier (i.e. it could achieve lower porosities) to fill the voids between the big ones with small ones. OK to invert that part?
Bruno
#1
(4) Does not apply. The ordering is random (right?).
(5) Trying psd.py, it seems the tabular psd is not fitted really well. Also, the size PSD and the mass PSD look similar. Am I interpreting something the wrong way?
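For what it's worth, here is a quick way to compare the two curves outside Yade (plain Python, purely illustrative):

import numpy as np

radii = np.sort(np.random.uniform(0.01, 0.05, 10000))                # illustrative sample of radii
number_cumm = np.arange(1, len(radii) + 1) / float(len(radii))       # fraction of particles with r <= radii[i]
mass_cumm = np.cumsum(radii**3) / np.sum(radii**3)                   # fraction of mass with r <= radii[i] (mass ~ r^3)
# the two curves nearly coincide only for narrow distributions; for broad ones
# the big spheres dominate the mass and the curves separate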
#2
I found the problem in makeCloud and get a good fit now. I'll let you know shortly.
#3
Hi Bruno,
See below for my answers to your questions.
(1) I will try to summarize them here, though there is some documentation in the code:
- makeCloud() in fact offers three possibilities: you can obtain a uniform distribution starting from either the mean radius or the porosity as input (plus the number of balls, of course), or you can define psdSizes and psdCumm, in which case you should obtain the distribution exactly as you input it. To be honest, I would use particleSD() for the latter, since I do not remember whether makeCloud() was working properly in that sense (Vaclav?).
- particleSD() and particleSD2() do essentially the same job. The difference is that in the first case you estimate the volume of solids from the number of particles and the mean radius; this means that if your size distribution is rather broad, the final number of balls you obtain is not the same as the one you input. With particleSD2() you get exactly the number of balls you asked for, because the volume of solids is not really an extra parameter: it is sufficient to input the number of balls and the porosity together with the size distribution (there is a comment in the code about that, I hope it is clear).
(2) I think none of the above would apply to your case. The percentage passing you input is cumulative, so I do not think you can obtain an interval of radii containing zero particles.
(3) Vaclav wrote that function and he can probably give you the answer.
(4) It does apply and makes sense, but only if you have the list of radii, so that you can actually choose to start from the bigger sizes; otherwise it is random, as you already say. Indeed this is what happens in both particleSD() and particleSD2(), since the list of radii is available there (and it makes a big difference whether you start placing the big balls or the small ones).
HTH, Chiara
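To make the three routes concrete, here is a minimal sketch (box, radii and counts are illustrative; keyword names follow the ones used in this thread and may differ slightly between versions):

from yade import pack

# (a) uniform distribution from a mean radius and a number of balls
sp = pack.SpherePack()
sp.makeCloud((0, 0, 0), (1, 1, 1), rMean=0.03, rRelFuzz=0.3, num=2000)

# (b) uniform distribution from a target porosity and a number of balls
sp = pack.SpherePack()
sp.makeCloud((0, 0, 0), (1, 1, 1), rRelFuzz=0.3, num=2000, porosity=0.6)

# (c) user-defined distribution through psdSizes (radii) and cumulative psdCumm
sp = pack.SpherePack()
sp.makeCloud((0, 0, 0), (1, 1, 1), psdSizes=[0.01, 0.02, 0.04], psdCumm=[0.0, 0.3, 1.0], num=2000)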
#5
(2) If you specify psdSizes (radii) and psdCumm with makeCloud, you should be able to get any psd you can describe by a non-decreasing piecewise-linear function. Therefore, in your case, you would have psdSizes=(r1,r2,r3,r4) and psdCumm=(0,x,x,100) (in percent, or the same values normalized to 1), i.e. a flat segment between r2 and r3; see the sketch after this post.
(3) I was not able to discover the reason, although I checked the derivation carefully several times. It is true though that I never used that function since Chiara introduced the particleSD method, which is more straightforward and gives lower porosity due to the ability to place bigger spheres first.
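For the staircase case, such a call could look like this (radii and x are placeholders, psdCumm written as fractions):

from yade import pack

r1, r2, r3, r4 = 0.01, 0.015, 0.03, 0.04   # illustrative radii
x = 0.4                                    # fraction of the material in [r1, r2]

sp = pack.SpherePack()
sp.makeCloud((0, 0, 0), (1, 1, 1),
             psdSizes=[r1, r2, r3, r4],
             psdCumm=[0.0, x, x, 1.0],     # flat segment => zero probability in (r2, r3)
             num=5000)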
#6
Oh! Chiara and I managed to send 3 posts in the same minute!
Thank you both for the answers. It seems particleSD uses discrete distributions (right?) and assumes a cubic volume, which is a bit restrictive.
1/ The difference between particleSD() and particleSD2() is not very explicit in the doc.
2/ Vaclav is right, it works (with makeCloud at least), since it results in zero probability for r2<r<r3.
3/ It's fixed. I re-derived it, and psdScaleExponent disappears.
I'll commit a makeCloud generating decreasing radii and scaling the psd down if the target number can't be achieved (also retrying recursively with a higher porosity if the target porosity is too low). Not much time now.
Bruno
p.s. trying to attach figures, not sure they will fit in LP answers...
#7
It is up to you to choose the shape of the box. With particleSD() you can assign the size of the box as input, so it is not necessarily a cubic volume.
(2) I understand your question now. You can do the same with particleSD() then (just set the same percentage for r2 and r3, as Vaclav suggests).
Chiara
#8
So, this comment in SpherePack.cpp (particleSD2) is obsolete?
/* possible enhacement: proportions parameter, so that the domain is not
cube, but box with sides having given proportions */
#9
That applies only to particleSD2(), which in fact does not ask you to specify the size of the box; it is assumed to be cubic for now.
By the way, when you say "scaling the psd down", you mean applying a shift to it, right? That would be quite nice to have, I agree.
#10
For particleSD2 only, ok.
Scaling down: a homothetic transformation of the distribution. It looks like a translation in log axes; see the figures I sent this morning for the 20k-particle case. The only difference between the figures is the target number of particles.
B.
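For reference, the translation in log axes is just the effect of multiplying every size d by the same factor k:
\log(k\,d) = \log(d) + \log(k)
so the cumulative curve keeps its shape and is shifted horizontally by \log(k).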
#11
Bruno, it is not clear to me what the scaling does. Can you describe it concisely? My idea about it now:
1. Try to place particles as required.
2. If it is impossible to place all of them, then you make the particles that are already generated smaller, until the rest can be placed as well.
Right? That means, however, that the size distribution is different (although "only" scaled down). Wouldn't it be better to increase the box instead (move the particles homothetically, but without changing their radii), so that the size (and mass) distribution is as required, although in a larger volume?
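As a back-of-the-envelope estimate of that alternative (assuming some achievable cloud porosity n, which is not something the current code exposes): keeping the radii fixed, the box volume needed to hold a total solid volume V_s is
V_{box} = V_s / (1 - n)
so each side of the requested box would have to be multiplied by (V_{box}/V_{box,0})^{1/3}, with V_{box,0} the volume originally asked for.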
#12
It's simpler than that: I made makeCloud() accept psdSizes, psdCumm and num together (such a call with all three params would throw currently). If num doesn't fit, the list of generated spheres is simply erased and makeCloud() is called again with the psd scaled down.
"Scaling" is a multiplication of all sizes in psdSizes by the same factor.
Is it better to scale down psd sizes or to increase the box size? It depends: the physics behind it, boundary constraints, taste, etc.
I've always been in the case where it is better for me to scale particles down. Because Yade's CundallStrack packings have size-independent behaviour, I can see "num" as a mesh-refinement parameter (if you refine a FEM mesh, you usually don't want to change the dimensions of the problem).
That explains why I can grow particles, or why I define packings via num and rRelFuzz (never by rMean). I'm not claiming it's fundamentally better, but it is what I need, so I irrationally tend to think it is the most frequent usage.
A bit more rational, maybe: minCorner and maxCorner are the only mandatory parameters (the others have default values), suggesting that the box is not something we should fiddle with.
Anyway, I'll add a way to get the scaling factor from the pack, so one can easily scale everything up again if needed.
Good news: ordering sizes gives 0.55 porosity (vs. 0.85 otherwise)!
Bruno
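A rough sketch of that retry logic in script form (not the committed implementation; it assumes makeCloud() simply returns with fewer spheres when it runs out of room, whereas some versions may raise instead):

from yade import pack

def cloudWithScaledPsd(mn, mx, psdSizes, psdCumm, num, factor=0.9, maxTries=20):
    scale = 1.0
    for attempt in range(maxTries):
        sp = pack.SpherePack()
        # retry from scratch with all sizes multiplied by the current scale
        sp.makeCloud(mn, mx, psdSizes=[scale * s for s in psdSizes], psdCumm=psdCumm, num=num)
        if len(sp) >= num:
            return sp, scale      # scale = factor by which the psd was shrunk
        scale *= factor           # num did not fit: erase everything and shrink the psd
    raise RuntimeError("could not place %d spheres even after scaling down" % num)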
#13
Bruno,
Yes, the distribution in particleSD() is discrete; I will add more documentation about that. Do you think a continuous distribution would make a big difference? In the DEM literature I can see that both are used. Any idea about that?
Thanks, Chiara
#14
I don't really know in which cases it could give big differences. It is good if both discrete and continuous distributions can be generated in Yade.
makeCloud works perfectly for the continuous case now (you can try psd.py to see how it works, with a revision before bzr2724 or after bzr2748; there is a small bug in between).
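For an analytical distribution (say a lognormal), one way to build the tabular input is simply to sample its CDF; a minimal sketch (scipy-based, parameters purely illustrative):

import numpy as np
from scipy.stats import lognorm

dist = lognorm(s=0.5, scale=0.02)                               # illustrative continuous size distribution
psdSizes = np.linspace(dist.ppf(0.01), dist.ppf(0.99), 10)      # sample a handful of sizes
psdCumm = dist.cdf(psdSizes)
psdCumm = (psdCumm - psdCumm[0]) / (psdCumm[-1] - psdCumm[0])   # normalize endpoints to 0 and 1
# psdSizes/psdCumm can then be passed to makeCloud() as in the earlier examples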