
/sci/ - Science & Math



File: Best Timeline.gif (1.10 MB, 205x223)
No.9230020

So I'm working on setting up my solar system generator for the galaxy map that I have. Part of the process involves setting the "range" and "intensity" of a point light based on the star that the star generator produces for each solar system.

I am using this chart as a reference for the star generator: http://www.enchantedlearning.com/subjects/astronomy/stars/startypes.shtml

What I need is a way to realistically represent the brightness of a star at a distance d from the star. A point light in Unity has two key properties: Range, which is how far the light travels before it stops affecting geometry, and Intensity, which is how bright the light is at its origin. I believe the Inverse Square Law is used to calculate how intense the light is at a given distance from the light.
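For reference, the inverse square law for a star of luminosity L gives the apparent brightness at distance d as

E = L / (4 * pi * d^2)

so doubling the distance quarters the brightness. As far as I know, Unity's point light only approximates this and forces attenuation to zero at Range, so nothing beyond Range gets lit no matter how high the intensity is.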

My current idea for solving this involves taking the luminosity of the star (a value based on the chart) and then normalizing and scaling it to 0-8 for the intensity. But the problem comes in setting a realistic range.

What would be some possible ways to establish a good baseline for setting the light range?
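Here's a minimal sketch of what I have in mind (Python just for illustration, since in Unity it would be C#; the log-scale mapping, the luminosity bounds, and the 1% brightness cutoff are my own assumptions, not anything from the chart or Unity's API):

import math

MAX_INTENSITY = 8.0  # upper end of the 0-8 scale I'm normalizing to
CUTOFF = 0.01        # assumed: treat the light as "gone" below 1% brightness

def star_intensity(luminosity, lum_min=1e-4, lum_max=1e5):
    # Map luminosity (solar units, from the chart) onto 0-8.
    # Log scale, since stellar luminosities span many orders of magnitude.
    t = (math.log10(luminosity) - math.log10(lum_min)) \
        / (math.log10(lum_max) - math.log10(lum_min))
    return max(0.0, min(MAX_INTENSITY, t * MAX_INTENSITY))

def star_range(intensity):
    # Baseline range: distance where inverse-square falloff drops the light
    # below CUTOFF. From intensity / d^2 = CUTOFF, d = sqrt(intensity / CUTOFF).
    return math.sqrt(intensity / CUTOFF)

i = star_intensity(1.0)   # a Sun-like star (L = 1)
print(i, star_range(i))   # ~3.6 intensity, range ~19 units

The idea being: intensity comes straight from the normalized luminosity, and range falls out of the inverse square law once you pick a brightness threshold below which the light may as well not exist.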

>> No.9230023

Also, a point light's range is limited to a maximum of 8 and a minimum of 0.
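Given that cap, the sketch above would just need a clamp on the computed range, e.g. min(8.0, star_range(i)) (reusing the hypothetical names from the earlier sketch), plus scaling the map's distance units so the brightest stars don't all saturate at 8.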