
DarkBASIC Professional Discussion / Making terrains with REAL USGS data, part 2

Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 25th Oct 2007 03:34
I wonder if it would be feasible or not at a decent FPS though. One thing you need to remember is that you'll be pushing polys to the video card all the time, and doing that can bog down your FPS unless you do it in batches.

Does DBP natively send polys to the GPU in an efficient manner, and if not, how plausible is writing plugin code for DBP to allow you to do so?

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 25th Oct 2007 03:51
Yeah, I read enough about ROAM from your link to know it's really cool (and potentially really cool) but it's pretty extreme the way I saw it described. Real-time poly optimization, recycled triangles (nice to be able to CAP it though), constantly manipulating the mesh(es), cutting where culled, multiple meshes at once, using demand queues and stuff to "render" the right ones - WHOA!!!! I want to make a game - not a living organism! hahaha

Seriously - it looked for the most part beyond me. The basics make a lot of sense - the level of skill required to do that, I don't have yet - and the fact that the main dude there says he's constantly trying to convince video card makers to add some hardware rendering for the hardware lists (of polys) so they are bigger than what is available now - tells me it's WAY high end!

DB Pro's underlying architecture - I don't know much about it, concerning how fast/efficiently it feeds polys to the video card. I would THINK that because DirectX is the REAL go-between from the software to the hardware, I would hope it has a certain amount of optimization - you tell it everything you want to do, and it does what it can to pump it into the video card as fast/efficiently as possible - but I truly have no idea.

Sounds like you, Sigh, know quite a bit about gfx in general eh?

Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 25th Oct 2007 04:38
Nah, I just read a lot - yep, I can read

My main area of expertise is hardware, I mean looooow level stuff such as CPUs, DMA controllers, FPUs, etc down to the gate level.

You don't push polys to the GPU "as fast as possible" per se. It's more "as efficiently as possible". One large write to the GPU is quicker than a bunch of small ones. With that said, it's better for you to send "triangle strips" to the GPU rather than a bunch of individual tris. There are a number of reasons for this, but the one most people here would understand is:

When you send a single tri to the GPU, you're sending (to keep the explanation simple) nine pieces of data - the XYZ coords for each vertex. If you send triangle strips, you can reuse vert data, resulting in less data being Tx to the GPU.

Here's a link for ya: http://everything2.com/index.pl?node_id=1470544
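A quick back-of-the-envelope sketch of the savings (Python here just to illustrate the counting - not DBP code - and assuming plain XYZ float vertices): a strip of N triangles only needs N + 2 vertices, while N separate triangles need 3N.

```
# Illustrative only: how much vertex data N triangles cost when sent
# individually versus as one triangle strip.

FLOATS_PER_VERTEX = 3    # x, y, z
BYTES_PER_FLOAT = 4

def bytes_individual_tris(num_tris):
    # Every triangle carries its own 3 vertices.
    return num_tris * 3 * FLOATS_PER_VERTEX * BYTES_PER_FLOAT

def bytes_triangle_strip(num_tris):
    # A strip of N triangles shares vertices: N + 2 in total.
    return (num_tris + 2) * FLOATS_PER_VERTEX * BYTES_PER_FLOAT

for n in (2, 100, 20000):
    print(n, bytes_individual_tris(n), bytes_triangle_strip(n))
```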

The Great Nateholio
Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 25th Oct 2007 07:47 | Edited at: 25th Oct 2007 08:59
@jason
For the "google earth" scenario, we'd need a server. Or, a real big amount of hard drive space on the client, if we use the resoulution we are using. I'm pretty sure I can make the grids on the fly. The thing that takes time is extracting out the individual tiles. So, if we split them up ahead of time, all we need from then on out is a positioning scheme, which I already have with the terrain naming scheme. There is no reason that the terrain data files can't be preprocessed. I just do it run time in the demo so I don't leave them on the users machine when they exit. So, my thinking was, get a real big chunk from USGS, run it through the file utility to break it into the small chunks, put it all on a server, and load and unload as needed. I haven't tried ANYTHING yet with DBPro network commands, yet, but I do have experience writing server side software in VB. It could handle the incoming and outgoing messages. Something down the road....but something I do need to do. I think a little test app is in order....

edited:
Jason, here is a dbpro project that demonstrates that yes, you can load the terrains from the already split up terrain files. It takes my machine about .2 seconds to do it, so, in game, it might cause a slowdown - not sure yet. Also, I compare it to loading a high poly DBPro sphere. The sphere loads in about .015 secs. I expected that, because of the file read and probably the setting of the normals. Try it out.
edit again:
Yup, if I don't set the normals, it knocks the load time down to about .15 seconds. So, there is room for improvement, but in the end I think the file reading will be the thing that slows it down the most.

Attachments

Hoozer | Joined: 8th Sep 2006 | Location: Bremerhaven (Germany) | Posted: 25th Oct 2007 18:03 | Edited at: 25th Oct 2007 18:06
Hello everyone,

I read a lot of your posts via my e-mail notification and I have to say "Keep it going!", because what might result from all these ideas sounds very interesting! For me it sounds like it could be some kind of "perfect" real-world terrain modeling for DB Pro, or it can inspire further ideas!

You may ask yourself: why doesn't he jump in and help with developing this "thing"?
The answer is, because:
1. I'm actually still trying to find a job,
2. I have to reinstall my machine in the near future (because it is messed up too much)! I want to give Vista a chance and therefore I have/had to search the web for proper drivers (an "official" BT878 analog TV tuner driver for Vista is "impossible" to find - no official driver found, but I "modified" the old Win2000 drivers to work with my TV application and to be able to use the remote control - and finding a 64-bit driver might be even worse, so I have to do a test install and play around with the drivers for a while)!
3. And the last reason is, I'm too bad for this "super-high-end" stuff!


I really hope to see/read more of your ideas and maybe some "demos"!

Hoozer

AMD64X24800+(939);2GB;GF6800LE (@12PS, 6VS; 380 MHz, RAM: 434 MHz)
DP-Sw-Mode-Comp-Entry (updated to V. 1.4):
http://forum.thegamecreators.com/?m=forum_view&t=109846&b=5&p=0
Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 25th Oct 2007 21:03
@Visigoth

DBP really needs a way to load objects a bit at a time so you don't get a momentary hit on performance every time you load an object.

I think that's the best way to do things such as loading files, terrain processing, calculations for shadows from static objects, etc. Just do a few calculations every cycle; stuff like I mentioned can almost always be calculated/done gradually, because it either changes slowly or you know enough in advance to be able to preload/preprocess.
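A minimal sketch of that "few calculations every cycle" idea (Python pseudocode standing in for a DBP main loop; the queue, step size, and tile methods are all made-up names):

```
from collections import deque

# Illustrative only: split one big job into many small steps and run a
# handful of them per frame so no single frame takes the whole hit.
work_queue = deque()

def queue_tile_load(tile):
    for row in range(tile.rows):            # hypothetical per-row work items
        work_queue.append((tile, row))

def run_budgeted_work(max_steps_per_frame=4):
    for _ in range(min(max_steps_per_frame, len(work_queue))):
        tile, row = work_queue.popleft()
        tile.build_row(row)                 # hypothetical per-row build step

# conceptually, the main loop becomes:
#   while game_running:
#       run_budgeted_work()
#       render_frame()
```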

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 26th Oct 2007 23:37 | Edited at: 27th Oct 2007 03:00
@Visigoth - Thanx for uploading that! I'll take a look. Honestly, the last two - maybe three - weeks I have been experimenting, reading, studying your code, internet browsing etc., trying to get a really good grip on all this. Basically, I'm still worried that with the precision we have available - without a really "slick" positioning system - USGS "Tiles" (like 3000x3000 snips of terrain) won't line up 100%, BECAUSE they give us in the header the Lat+Long of the Lower Left Corner - but depending on floating point accuracy etc., I can envision 1000 mile long "missing tile" seams where they need to be humanly nudged over somehow.

Going into the weekend - I'm going to TRY to make a small loader - not even in DarkBasic - more or less a preprocess - to read "USGS" *.hdr and related "heights" from the FLT file, and position them accordingly - so when I "Load-em" the two "Tiles" are merged in the right positions.

The idea is to make an insanely huge file that is TOTAL WIDTH/10-meter chunks - not the whole world, but a 1000x1000km square if desired - then "overlay" the USGS data in the correct spots in the file, then modify geomapper (yours) to be able to go through it and make a disk based database of "X" files. With a naming convention that makes them readable/usable via some sort of "auto function" where you just give it a coord - and "Bing" - it returns the correct file name.

Again, I've done BIG disk "file" databases before - and it requires directory "chunking" so you can retrieve quickly and not overwhelm the OS with too many files in a single directory.

Like C:\MyDir\N123.12345654_x_W23.45323.x would be stored something like:

C:\MyDir\N123\12345654_x_\W23\45323.x

(the long name diced into nested sub-directories). I have routines to "Save" and "Retrieve" file names in this manner - automatically making the DIR on save, and dicing the desired file name when about to load, etc.

[edit]Due to how floats behave, the format "names" won't be floats like I showed in the chopped dir example. I will likely have a BIG GRID numbering system, and within each BIG TILE there would be another complete grid system (to keep number sizes manageable and to allow integers to reference specific tiles).[/edit]
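A tiny sketch of the directory "chunking" idea (Python, with hypothetical grid indices rather than the float names above): derive a nested path from the tile's grid numbers so no single folder ends up holding millions of files, and make the directories on save.

```
import os

def chunked_path(root, lat_index, lon_index, row_index):
    # Hypothetical layout: one folder per latitude band, one per longitude
    # band, then the row file - lookups stay shallow, folders stay small.
    return os.path.join(root, str(lat_index), str(lon_index), f"{row_index}.bin")

def save_tile(root, lat_index, lon_index, row_index, data):
    path = chunked_path(root, lat_index, lon_index, row_index)
    os.makedirs(os.path.dirname(path), exist_ok=True)   # make the DIR on save
    with open(path, "wb") as f:
        f.write(data)
```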


Any ideas people may wish to contribute?

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 27th Oct 2007 02:20
@Jason,
I didn't want to talk about this just yet, but, I have a method for "scrolling" very large terrains. Right now, I can manipulate over 4,000 tiles. I should have another demo up probably late tomorrow.
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 27th Oct 2007 02:57
AWESOME Visigoth! Great!

Do you think my idea of using your system as part of a MASS processing tool - to make a terrain DB - is a good approach?

Do you think I'm wasting my time trying to make a huge "GRID" made of multiple USGS downloads, "positioning" them like a puzzle mosaic, and then chopping them up into "mesh tiles"?

I only ask because our goals seem similar - and you definitely seem to be ahead of me on this one. I'm personally persevering because I think the results could be astonishing if seen through to fruition. You definitely inspired me on this.

Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 27th Oct 2007 05:29
Must...stay away...from...terrain....

Ohhh, I give up

Now I gotta try and implement an idea related to this.

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 27th Oct 2007 15:17
@Sigh - Atta-Boy

I thought I should post this link as it has QUITE an extensive explanation of latitude/longitude measuring - plotting bearings and various calculations - with each having a strong point - one of them even has accuracy down to a millimeter! (That's just scary)

Any way: http://www.movable-type.co.uk/scripts/latlong.html

@Visigoth - I can't wait until I can see your 4000 tile implementation - what's that - 4000 tiles total, or across and down?

I'm shooting for an area around 74012 tiles across by around 55600 tiles! Though your stuff works, I'm still trudging through "global accuracy" issues. This is why I'm pursuing the long/lat calculations - I want to get a database "schema" set up so I can plot a reasonable layout that will work regardless of the position on the globe.

Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 27th Oct 2007 20:11
So far I've resisted my urges to code for this - this stuff, while very tempting, needs to stay on the back burner of my project.

I did break for about 5 minutes though, lol. I was coming up with a standard for "tile" serving terrains in DBP.

What I drew up was the following:

There would be two general formats a server query would return - integer and FP. In my project there isn't much of a need for the fractional part of an elevation, as everything (for terrain elevation) is rounded to the nearest foot and permanently set that way. This can save on the amount of crap you have to push through a network.

The server responds to a terrain query with a byte containing various status flags; a byte indicating the "bits per index"; a byte containing the "bits per elevation"; a value (byte or word) that indicates the number of elevation points along the X-axis; and a value (same size as above) that indicates the number of elevation points along the Z-axis; all this is followed by the elevation data. So it looks like this:

Flags (8 bits)
BitsPerIndex (8 bits)
BitsPerElevation (8 bits)
XIndex (8 or 16 bits)
ZIndex (8 or 16 bits)
ElevationData (8/16/32 bits per point)


Flags are defined as follows:
Bit 0 - Integer/FP (IFP)
Bit 1 - Relative Measurement (RM)
Bit 2 - Compressed Datachunk (CD)
Bit 3 - Stripped Triangles (ST)
Bit 4 - Meters/Feet (MF)
Bit 5 -
Bit 6 -
Bit 7 - Escape Extension (EE)

Integer/FP - Set if this datachunk contains elevation data that is in floating point format (4 byte). Cleared if it is integer format (1 to 4 byte).

Relative Measurement - Set if this datachunk contains data measured from a relative elevation (i.e. relative to the last query the client presented to the server). Cleared if the datachunk contains absolute measurements.

Compressed Datachunk - Set if this datachunk contains compressed elevation data, cleared if not. Useful for things such as plains where elevation data can be the same over a group of measurement points.

Stripped Triangles - Set if this datachunk contains only triangle strips, cleared if not. Triangle strips reuse some vertex data from the triangle before (adjacent to) it, resulting in fewer verts needing to be sent. Use triangle strips for meshes that are organized in only a grid pattern, not meshes that have been processed by a poly merging/reduction function (resulting in mesh structures not easily divided into strips)

Meters/Feet - Set if this datachunk is measured in meters, cleared if in feet.

Escape Extension - Set if this datachunk is formatted differently than normal (i.e. to allow for using a number of different formats - basically "We didn't have enough flags and variable space with the first versions of this format, so we added a bit to allow more room later").


It's important to note that most of these parameters are set by the server depending on what will result in the least amount of network traffic. Most of this seems like it would only be a benefit to integer numbers though, because FP values are always a minimum of 4 bytes (correct me if I'm wrong, but I'm used to FP being at least 80 bits).
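A small sketch of packing/unpacking that response header (Python; the bit positions follow the flag list above, but the byte order and the 16-bit X/Z counts are assumptions, not part of any agreed format):

```
import struct

# Flag bits, as listed above.
IFP, RM, CD, ST, MF, EE = 1 << 0, 1 << 1, 1 << 2, 1 << 3, 1 << 4, 1 << 7

def pack_header(flags, bits_per_index, bits_per_elevation, x_points, z_points):
    # Assumes little-endian, 8-bit flag/size fields and 16-bit point counts.
    return struct.pack("<BBBHH", flags, bits_per_index, bits_per_elevation,
                       x_points, z_points)

def unpack_header(data):
    flags, bpi, bpe, x_points, z_points = struct.unpack_from("<BBBHH", data)
    return {"float_data": bool(flags & IFP), "relative": bool(flags & RM),
            "compressed": bool(flags & CD), "strips": bool(flags & ST),
            "meters": bool(flags & MF), "escape": bool(flags & EE),
            "bits_per_index": bpi, "bits_per_elevation": bpe,
            "x_points": x_points, "z_points": z_points}

# Example: an absolute, integer, meters datachunk of 101 x 101 points.
header = pack_header(MF, 16, 16, 101, 101)
```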



Server functions initiated by a client query:


Request Chunk
Requests a datachunk from the server

On entry:
Flags - bit 0 set if XPosition/ZPosition are FP, cleared if integer. Bit 1 set if Radius is FP, cleared if integer. Bit 2 set if values are in meters, cleared if feet
XPosition - the position along the X-axis the chunk is to be centered
ZPosition - the position along the Z-axis the chunk is to be centered
Radius - the radius of the measurement


...and that's as far as I got. I started because I think this is a good opportunity to write a "standard" for DBP apps concerning terrain serving. Plus, if you never come up with a standard for people to PnP, they could just pass up your code in favor of their own or someone else's.

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 27th Oct 2007 20:46
Hmm... I'm very glad you took an interest Sigh but I think your solution may be a little overdone. There are enough complexities with global positioning, measurements, plotting, and making the "Grids" merge nicely. No I don't think my solution is the best - hardly - but I think you should consider the following:

The float datatype used in the USGS data is only 4 bytes. (Switching to integer saves nothing. Collapsing bytes here and there could be beneficial, but I think the clock cycles to pack and unpack would about balance it out.)

Zip/PkZip algorithms work pretty quickly and do a much better job than ZMODEM "compression", which is based on the type of algorithm you're suggesting to save bandwidth (based on repeating values).

Triangle strips and triangle "fans" are fast - DarkBasic kinda handles the "your mesh" to "DirectX" communication for you, AND "triangle strip"/"triangle fan" vertex info can be created on the fly from the height data - this would be a good approach for directly programming DirectX or OpenGL. Preprocessing it, when it can't (to my knowledge) be consumed by DarkBasic anyway, doesn't make sense to me.

The closest thing we have to a standard is the "flavor" of the data given/attainable from USGS. After that, I see two options: use their data and make the meshes in the program on the fly, like Visigoth has demonstrated, or make a database of meshes at various resolutions, and load and destroy at will.

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 28th Oct 2007 01:37 | Edited at: 28th Oct 2007 01:41
@Jason,
well, just got on the pc today. Plan on coding into the wee hours. Anyway, I do have a problem with .x files. They take too long to load. According to my tests, to load and position an .x file version of the terrain takes right about 1.0 seconds. If I load it with the .bin file, about .2 seconds. I am including a test of this so you can see. So, updating terrain in real time would be a little slower using .x files. More to come...

@Sigh,
You just gave me a new idea. something totally different, entirely. Thanks.

Attachments

jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 28th Oct 2007 01:46
I got similar results UNTIL I tried DBO (I didn't expect this - check this out!)

I even tried swapping which one I loaded first - this is pretty amazing.

Attachments

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 28th Oct 2007 02:44
@Jason,
if this is true....awesome.
Gonna write the changes into the new demo now.
Thank you, I didn't think to even try dbo.
Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 28th Oct 2007 04:25
The "optimizations" I posted arent for speeding up the "game" on a machine, theyre only suggestions to try and implement a network side to the system. The things I suggested dont care what you do with the data once its in the client machine, theyre just to get it there from a server in somewhat of an efficient manner (or youd hope so).

If someone could only make a dll based on the network code from TGE....

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 28th Oct 2007 14:42 | Edited at: 28th Oct 2007 15:16
Quote: "theyre just to get it there from a server in somewhat of an efficient manner"
Yeah, I know what you're proposing - I'm just saying that getting that data into that state - and the client side code to "put it back together" - might be a bit much.

I'm currently trying to process about 750x350 miles at 1/3 arc - and it's a lot of data, and it's only about the size of Hawaii and the "square of ocean" around it. Getting a database of the whole world processed in such a way that the "Zmodem'ish" protocol you recommend could use would be a lot of work to set up. The client putting it back together would probably not be that bad time/CPU wise, but trying to convert the raw data from USGS probably would be a nightmare.

I'm not even saying it wouldn't work - just a lot of hoops - where I think maybe a more generic (less 3D-info-specific) compression algorithm might suffice and be easier to implement... literally on a global scale.

Let's get one thing straight - I think your ideas are very, very plausible - and would work. But I also think a generic compression algorithm would be much easier to implement and would still compress the patterns found in the numbers of even uneven plains/terrain.

That's all.

The problem I'm running into now - while Visigoth is making a tiling system work on a larger number of tiles, smoothly, for a demo etc. - is that the USGS data comes in chunks. Look at the attached picture... [edit]pic in next post .. forgot sorry[/edit] See how there are places with and without data? Both in the TILES you download - and often "NODATA" is just not downloadable. (Ocean depths/terrain is probably another kind of resource altogether.) The problem I'm facing now is that Visigoth's tiles are based on 100 (or less for lower res) and, for even seams, he grabs 101x101 but increments the "tile top left corner" by 100. This is perfect - we've all seen it - but it becomes a problem as one USGS FILE meets another. They aren't in 100x100 chunks. They are quite arbitrary.

I've been trying to work out various solutions. One - the behemoth "work file tile" - is to make a perfect grid tile that is rather large, and divisible by 100 (+1) in both directions, and to try to plot all the data you may have from various satellite downloads in their appropriate spots. This gets complicated by the longitude+latitude coordinates and an ellipsoidal world. Not to mention making the tool smart enough to handle (seamlessly) putting the right half of a USGS FILE that didn't fit in one of these "work file tiles" into it, and putting the remainder in another.

I don't want to work on this forever - so I might just Settle.

With the first strategy/plan - picking a behemoth file that is bigger than the area I want to map, so all the tiles fit in it - my tests so far involve creating an empty 16 gigabyte file - for just Hawaii! Naturally, in this situation, once the high, medium, and low res tiles are created, this big work file can be deleted - but processing it is SLOW.

The other "plan" is to settle on one tile, and process the number of meshes that fit across and down, divisible by 100, by making a bunch of tiny bin files that are literally 101x101 each - then a DBPro util based on Visigoth's code would turn these into the various "depth" meshes.

@SIGH - I'm curious if you can come up with an easier to implement network protocol for transferring packed dbo's and textures - as this seems to be the fastest to process as far as DBPro/DarkGDK are concerned.

@Anybody - Curious what you think of these approaches, with the hope that you have a brainstorm idea that would simplify it (and be "recallable" into a program somehow) - or a completely different approach.

My reading on the subject so far, about how flight sims do it, is along the lines of the behemoth monster tile for preprocessing. But they make the MONSTER FILE - and then "touch it up manually" (imagine an editor that could load "squares" of 16 gig files? MANY SEEKS! Couldn't be that fast I'd imagine). They have to touch it up manually because sometimes the satellite files are just missing data, or they started a new "SCAN" as they flew around the earth in a slightly different spot - so it's not as seamless as we would like.

I'm glad you took some time to look into this Sigh - I just think we need a few more minutes to get something that is scalable

jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 28th Oct 2007 14:43 | Edited at: 29th Oct 2007 00:32
Sorry - the picture - (If you upload with an edit the forum doesn't display it right unless you download - hence double post)
[edit] moved update to new thread so email readers got it[/edit]

Attachments

jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 29th Oct 2007 00:34 | Edited at: 29th Oct 2007 02:42
Update:

I think I have a workable database/directory layout. It's simple enough, and I have a utility that can create the structure, and routines that can turn a lat/long pair into the correct PATH to get a tile. (The part where you turn a lat/long pair into the correct path will need to be ported to DBPro/DarkGDK as well.)

The Structure looks like this: e:/World/N/18/W/155/53

E:/World/N <-- Complete North Hemisphere
E:/World/S <-- Complete South Hemisphere

E:/World/N/??? <-- ABSOLUTE Latitude (Positive # always)
E:/World/S/??? <-- ABSOLUTE Latitude (Positive # always)
(total 90 directories, 0-89 degrees lat each)


E:/World/N/18/E <-- Complete Range of Longitude 0-179 East of PM
E:/World/N/18/W <-- Complete Range of Longitude 0-179 West of PM

E:/World/N/18/E/??? <-- ABSOLUTE Longitude (Positive # always)
E:/World/N/18/W/??? <-- ABSOLUTE Longitude (Positive # always)
(total 180 directories, 0-179 degrees each )

The hardest part is the last directory name, as it's a calculation. This calculation makes sense given Visigoth's 1km tiles; however, allowances for "overlap" have been designed in to allow (hopefully) seamless transitions between, for example, latitude 18 and 19, or longitude -155 and -154. We are only talking about a kilometer or less.

It works like this: Latitude doesn't vary enough to matter between degrees. Each latitude degree is always 111.18 km apart. This, unfortunately gets more complex when discussing longitude (Think segments of fruit in an orange - thinner near the ends than in the middle). Ok - so longitude aside - the calculation has to do with LATITUDE only.

LATITUDE distance from Equator to 1 degree = 111.18km. So I Rounded up - to 112. Take ANY LATITUDE DEGREES, and LOSE the WHOLE number portion. DIVIDE that by 0.0089285 (Which is ONE divided by 112) and you will get the proper DIRECTORY. 0 thru 111

That's the storage structure.
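A sketch of that path calculation (Python; just a restatement of the rules above, with the constants as described):

```
KM_PER_LAT_DEGREE = 112.0      # 111.18 km rounded up, as explained above

def latlong_to_dir(lat, lon, root="E:/World"):
    ns = "N" if lat >= 0 else "S"
    ew = "E" if lon >= 0 else "W"
    lat_deg = int(abs(lat))                        # 0..89
    lon_deg = int(abs(lon))                        # 0..179
    frac = abs(lat) - lat_deg                      # lose the whole number portion
    row = int(frac / (1.0 / KM_PER_LAT_DEGREE))    # 0..111, the ~1 km row
    return f"{root}/{ns}/{lat_deg}/{ew}/{lon_deg}/{row}"

print(latlong_to_dir(18.4775, -155.53))   # -> E:/World/N/18/W/155/53
```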

For the DATA - I've decided I just need to do some serious coding - no real shortcuts - because the behemoth idea has too many limits, and slicing and dicing just one download from USGS will just mean no seamless terrain and limited contiguousness.... The best way I can think of to do it is to READ a SATELLITE USGS download, and CALCULATE the LONG/LAT for each "pixel" (or height) they give me in the 1/3 arc download. The idea is simply to make little bin files like in the sample Visigoth gave us to show loading times. This file size is exactly 101x101 floats (100x100 = tile, the extra is for a seamless merge with adjacent tiles). This works out to exactly 40804 bytes per bin file. THESE are the files I will make from the USGS downloads.
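A minimal sketch of reading and writing those little bin files (Python; it assumes the .bin is nothing but a straight dump of 101x101 32-bit floats, which is what the 40804-byte size above implies):

```
import struct

TILE_SIZE = 101                            # 100x100 tile plus the overlap row/column
FLOATS_PER_TILE = TILE_SIZE * TILE_SIZE    # 10201 heights -> 40804 bytes

def write_bin_tile(path, heights):
    # heights: flat list of 10201 floats, row by row
    assert len(heights) == FLOATS_PER_TILE
    with open(path, "wb") as f:
        f.write(struct.pack(f"<{FLOATS_PER_TILE}f", *heights))

def read_bin_tile(path):
    with open(path, "rb") as f:
        return list(struct.unpack(f"<{FLOATS_PER_TILE}f", f.read()))
```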

I have decided "Making smaller" tiles at the edges is not what I'm going to do. What I will do is make a tile - regardless if the data can all fit acroos or down - so that a place holder is there - one for a mesh to be made - two - so consectutive downloads can "Fill in the blanks".

I've decided this approach is best because:

1: Minimizes disk usage to available terrain data, as it should be.

2: Keeps Visigoth's proven algorithm intact.

3: Will make in-game calculations (as to what tile to grab) easier - even allowing seamless display when the application "picks a tile" from two different directories because they overlap. As long as the game positions the tiles according to long/lat, everything should line up pretty well - unless you're in the proverbial camera "space shuttle" zooming around.

(Naturally culling will need to be implemented - recent findings show that making a util to convert these "bin" files to meshes, and ultimately DBO files, will probably give the best DarkBasic/DarkGDK performance.)

Next - writing the code to glue the mosaic of USGS together, now that a storage system has emerged and is in place.

@Sigh - Can you come up with an efficient networking protocol based on this existing "data layout"? I think double precision coordinates will need to be used - that's only 8 bytes (I THINK) for a given lat or long, the pair totaling 16 bytes. I'm curious what you would recommend - sending over the little bin files and having the client make the DBO's, zipping dirs up and sending those (with dbo and/or bin), or what.

@Visigoth - Does this make sense? I know it's a bit of reading - but I actually have been messing with this effort wholeheartedly - and I really have been giving it 100%.

For the record: This directory structure - installed - makes 14,515,200[edit]I goofed - it's half that 7,257,600[/edit] sub directories - with a look up depth of only 5 tiers, with a maximum count in a single tier being 360 for the longitude, 112 for the kilometer horizontal (lowest tier) and 90 for the latitude.

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 29th Oct 2007 03:43
@Jason,
Wow, you've been thinking, haven't you? Good. I have a couple of thoughts. First, the source data. 1/3 arc second data is not available for the entire US yet. They keep adding it as they get it. I think 1 arc sec is. Also, if the plan is to use the seamless site to download the 250mb max chunks, it's gonna take a long time to get the data. It might be better to order really big chunks and pay for them. The USGS does do custom data, for a fee. I can look into this some more.
The client; how many tiles would be displayed at one time? With the demo I will post a little later, you can set different layouts, like 5x5 or 8x8 or 20x20, if you want.
What happens with continuous terrains if you move a long distance? If we just keep adding terrain, eventually we'll be getting into really large numbers for position. I don't know if this will be a problem or not. Will definitely have to use negative numbers, at the least.
One last thing, do you have any messenger service? If you want, maybe we can share ideas that way also.
Other than that, I think it's great you are working hard at this, keep it up!
Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 29th Oct 2007 04:00
One thing you need to decide before you even think about doing more work is: what is the target application of this system?

How you implement something will depend on its intended use; what's good for an MMO/game won't be good for a program like Google Earth.

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 29th Oct 2007 05:15
@Visigoth - I'm on Yahoo IM a lot - every day at work - and once in a while on the weekends. On the weekends I check my email A LOT, staying in tune with forum developments while I'm coding - I get stuck, need a code break, I log in. Both ways of reaching me are public - in the little "Email" and "Yahoo" buttons under each of my posts.

Your thoughts about the source data are good ones. To date I've been basing everything on the 1km tiles you have devised for the 1/3 arc. If possible, I'd like to stay with numbers as even as possible.

If you look at both headers - 1/3 arc and 1 Arc - they both have the "Cell Size"

1/3 Arc Sample CellSize: 9.25925926e-005
1 Arc Sample CellSize: 0.00027777777779994

You stated in one of your posts that the 1/3 arc second data was approximately 10 meters. Are these CELL SIZES metric? Or English measure (in miles)? I assumed it was all metric. If you know for sure - please do tell.

If it's possible - without glaring errors - it'd be nice if we could read the data in such a way that we still end up with like-sized tiles (preferred, to get something running quicker) - or multiples, like 1/3 arc second data = 1km square (100x100 verts like you are using now), and 1 arc second = 100x100 verts but 2km square, or something like that. I'd really like - if mathematically/logistically possible - to stick with a base size (1km square) for a first draft anyway. Bigger chunks have their place - but I think the directory structure would then have to take these differing resolutions into account, so it can all mosaic together reasonably (with little/no seams).

Quote: "The client; how many tiles would be displayed at one time?"


Well - I leave this up to the client. Any client could benefit from these meshes. Texturing is going to be a chore - whether shader based, texture based, or some combination thereof.

I figure ANY client using more than one tile - will want some sort of culling, and as far as "How big at a time" - I figure whatever they need. This touches on what Sigh recommended I/WE think about:
Quote: "One thing you need to decide before you even think about doing more work is - what is the target application of this system?
"
And my response to this is: agreed. This definitely matters - but at the same time, I'm trying to help build something that can be used for varying purposes. I definitely agree the client needs to be DEAD ON with its own goals, presentation, "engine" to display it, etc.

I'm hoping a generic enough, "sound" database design can be used a lot of different ways via a consistent interface and support functions. MMO is a broad brushstroke/game type description for trying to know whether this could work or would be the right kind of thing to use - same for RPG. The reason I feel pretty confident this is generic enough for almost anything is that the units are currently small - 1km squares - available in varying resolutions (thanx Visigoth for that), and a game could use 1, 2x2, 3x3, whatever. If the game requires a 20000 ft view, and we don't have a data format that suits this requirement easily, then that data could be generated by sampling what we do have - pretty much how Visigoth got the three resolutions he showed us in his demo, where he has Mohawk Valley, some lake (??) and the Grand Canyon.

I definitely see how the CLIENT's needs might affect the whole networking approach - like an MMO might need a certain "quick" realtime system, or maybe if it's flight sim'ish there is an initial client download for people "joining an area" to play.

I really don't know yet. My specialty is system design - and I definitely, if nothing else, know how important it is to TRY to have a solid foundation to build on - and if you think too, too long you get nowhere, and if you rush you pay later. So I'm treading lightly and doing a lot of tests and research.

The biggest snag I have so far is the WIDTH of given "earth" degree squares. I'm not Einstein - I'm not a math genius - give me a decent formula though, and I'll write the software around it and it will work. I've tried various formulas - haversine, Vincenty is rough - I've tried them, and the results have not been great. I really need a decent code snip (formula) so that at a given latitude I know how far apart the longitude radials are. I thought of using a "table" of data (90 different values minimum), and I may just do that to put the results of the functions in stone and not leave much room for deviances - but I think a function might be more efficient. At 0 latitude the radial width is 111.18km, and at 89 latitude it's like 1.9km... and it's not "even" - the ellipsoidal shape of the earth causes issues.

So - I think we're all going to get there - and I must say, I have not found a simple FAST way to do this. I think it's one of those things where you just need to slow down, break out Google, a calculator, paper and pencil, draw pics and diagrams, and figure out a halfway decent approach.

I'm very much looking forward to your demo, Visigoth, so I can be reminded what this is going to potentially look like. Additionally, Van B has modified Green Gandalf's shader to have as many as 6 textures represented in one tile, where a color map makes the decision on what texture and how much of it to show, I believe. I THOUGHT I read about one that used the NORMAL angle - that would be great - then we could possibly set up one fancy shader, apply it to all tiles, and they would texture themselves in a high detail, reasonable manner - hopefully not too heavy on FPS - definitely not a memory hog doing it that way....

Getting tired - work tomorrow - ugh... need three day weekends!

jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 29th Oct 2007 05:26
[sorry - I need to stop doubling.. but I had to answer this question from Visigoth]

Quote: "What happens with continuous terrains if move a long distance? If we just keep adding terrain, eventually we'll be getting into really large numbers for position."


Contiguous. Well - the dir structure I set up actually gets down to the one TILE HIGH (1km) horizontal ROW - and the number of tiles in this row is to be based on the longitude radial distance.

Because the radials are AT MOST 111.18km at the equator, the highest number you'll have for positioning is simply 112 (MAX... MAYBE 113), and these HIGHER numbers are only to allow "seamless" overlap. That is, the very adjacent tiles may overlap each other.

How does the client handle this? If you're traversing a ton of terrain, it would be loading and deleting a circle (or square) around the camera. Need it more flight sim'ish? Then we really need to make 1 arc second - and possibly much lower res but LARGE - tiles, so from high up we can still present reasonable terrain without needing 10000 tiles loaded, or some ridiculously large number of them.
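A small sketch of that load/delete ring (Python; the tile size, radius, and the load/kill hooks are all assumptions):

```
TILE_KM = 1.0    # the 1 km tiles used throughout this thread

def wanted_tiles(cam_x_km, cam_z_km, radius_tiles=2):
    # Tile coordinates that should be resident around the camera.
    cx, cz = int(cam_x_km // TILE_KM), int(cam_z_km // TILE_KM)
    return {(x, z)
            for x in range(cx - radius_tiles, cx + radius_tiles + 1)
            for z in range(cz - radius_tiles, cz + radius_tiles + 1)}

def update_streaming(loaded, cam_x_km, cam_z_km, load_tile, kill_tile):
    wanted = wanted_tiles(cam_x_km, cam_z_km)
    for tile in wanted - set(loaded):
        load_tile(tile)      # caller decides how a tile is actually built
    for tile in set(loaded) - wanted:
        kill_tile(tile)      # and how it is destroyed
    return wanted
```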

ok - NOW I'm going to bed. Night people.

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 29th Oct 2007 10:54
ok, demo is done, but if I put the .flt file in with the project, it exceeds the file upload limit here. So, I think I'm gonna tweak it just a bit more to make it more generic, so you can download your own big .flt file and run it. So, have a little more patience, it's like 1:45 am here now, and I have to work at 7:30, so I'm gonna quit for now, and put something up hopefully tomorrow.
But for now, here is what the demo does.
It takes a really large .flt file, and then splits it into the .bin files. In my case, the .flt is 7400 columns by 5400 rows, 167 mb, so it could still be bigger. I end up with 4218 .bin files. Then, it makes the .DBO files: 4218 high detail ones, 4218 medium level ones, and 4218 low level ones. This process takes some time, of course. Also, before you even attempt this, make sure you have a lot of free hard drive space, like more than 15 gigs to be safe, because the way Windows reserves some HD space for swapfiles could, I think, be a problem if you are low on space.
After the .dbo files are made, the app will launch. It will display a 4x4 grid of tiles. This can be altered in the code for more, if you like. In the app, you can change the LOD, and you can scroll the 4x4 grid across the full 74x54 that are available. The grid actually stays still, it just loads and kills the .dbo's and repositions as necessary. There are some camera movement commands so you can explore the terrain. I did not put in walking code.
I think, after all this, this may not be the best way to do this. I hated having to wait for the .dbo files to be built. The .bin splitting is very fast compared to that. And compared to the other method, making the tiles on the fly, the scrolling isn't really that much faster, plus you end up with too many files.
So, when I do put this one up, it is NOT FOR THE FAINT OF HEART. It will make your pc work. But only the preprocessing stuff. Once everything is built, it runs like butter.
So, hopefully, tomorrow (tonight? What time IS it again?)
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 29th Oct 2007 19:49
Ahhh - good progress Visigoth! I had a revelation of my own.

The file system I made should not be "just created" and then filled. I think it would be much "nicer" for NTFS if I only created the directories on demand. This way, as the USGS flt files are being read, the "saving" of actual tile data creates only those directories that will get actual files in them. This should cut things down a lot, because "oceans" don't need directories unless you're doing underwater terrain - and frankly, this system could work for that too.

In short - have you ever asked Windows to delete a directory with 7.5 million subdirectories strewn throughout? It's not PRETTY! Everywhere I read about the file structure limits for NTFS, the file system I came up with barely scratches the surface of any kind of limit. (That's what all the subdirectories prevent... it's very bad having too many files or dirs in a given directory, but nested is ok.)

The "disk based" systems I've made in the past did the on demand thing - and never had a flaw - and they were only as complex as the amount of data being stored. More data? More complex dir tree. But easy to manage.

I know you don't like the DBO file creation etc - but fear not. You still are writing demos for the general public to view/play with. That kinda is the problem - in that - there needs to be TWO applications or intelligence added - or a menu option.

1: Utility Portion - Make the DBOS - One Time

2: "Viewer" Program - lets you walk around etc providing you made the DBO's first.

Making the app smart enough to know "First Time? Please Wait" etc. would be a good enough way to do it and would help.

In my little GEOMAP utility I'm working on, I'm making it smart enough to sense new satellite info, just process it, and then delete it (though delete is shut off until I know it's flawless, so I don't have to download info twice, yada yada).

But after our benchmark results - I think DBO is doable

Sigh | Joined: 26th Dec 2005 | Location: The Big 80s | Posted: 29th Oct 2007 23:32 | Edited at: 30th Oct 2007 00:00
I haven't looked at how DBO files are structured yet, and don't plan to any time soon due to my project. I'm willing to bet that they aren't optimized for transfer over a network any more than .x files are.

With that said, the best thing to do (and I shouldn't have to say it) is transfer files in some sort of binary mode, with transfers occurring in the background, perhaps a few strips of a tile each frame. If you don't do it that way you'll most likely end up with network bog affecting gameplay as a tile is being transferred from server to client.

I'll just use the 101 x 101 tile size in an example -

Assuming the worst case:
3 verts per triangle multiplied by
2 triangles per square x
101 squares per tile strip x
101 tile strips per tile = ~60,000 verts per tile
Multiply this by 4 bytes per vert = 239kB of data

Now let's assume that we are using a 56k net connection; at best, if we are Tx'ing 56kB of tile data/sec, we can transfer the tile in about 4 seconds.

That's quite a burp in a game environment while the system waits for a tile from a server.

If we break it up into strips of 101 squares, the amount of data per transfer is reduced to about 2.36kB. Now we transfer one strip each frame, every other frame, or whatever. At 30FPS, loading a strip every frame, we can load an entire tile in about 3.3 seconds.

Even so, there isn't any real need to transfer a tile that fast, so we can further divide our transfer operations, perhaps half or a third of a strip each frame.
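The arithmetic above, written out (Python; it takes the 4 bytes per vert and the 56 kB/s figures from this post at face value):

```
verts_per_tile = 3 * 2 * 101 * 101          # 61206 verts (~60,000)
bytes_per_tile = verts_per_tile * 4         # 244824 bytes ~= 239 kB

whole_tile_seconds = bytes_per_tile / (56 * 1024)    # ~4.3 s at 56 kB/s

bytes_per_strip = 3 * 2 * 101 * 4           # 2424 bytes ~= 2.4 kB per strip
strip_streamed_seconds = 101 / 30.0         # ~3.4 s at 30 FPS, one strip per frame

print(whole_tile_seconds, strip_streamed_seconds)
```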

Now on to the textures you mentioned.

Textures would need to be mip-mapped, perhaps down to 1/8 or 1/16 resolution for quick transfer. Let's take a 256 x 256 x 32 uncompressed terrain texture:

Various mip-map levels -
mip0 = 256kB
mip1 = 64kB
mip2 = 16kB
mip3 = 4kB
mip4 = 1kB

Of course you wouldn't use mip0 textures anywhere but near the camera, perhaps in the immediate 9 squares surrounding the camera.

Again, textures can be transferred a row at a time.

Forgot one thing -
To build on this even further, you can specify vertex colors for terrain which is at a good distance from the player, instead of using textures. This gives the terrain tile the correct color, which is all you really need for something that is afar.
The verts would be colored the same color as the average color of the texture pixels nearest them.


Mip-mapping is something that SHOULD NOT be done in DBP, use assembly instead to make use of MMX/SSE architectures and instructions. But you could have the server use precomputed mip-maps....

Now I'm sure someone is thinking, well, nobody uses dial-up nowadays, we've got all the bandwidth we need with broadband. This may be true for the most part, but you still have to plan for the worst (or at least bad) network conditions. Also, thinking like this is how software, among other things related to computers, gets bloated beyond what it really needs to be (Windows, media players, text editors, etc). In my opinion, bandwidth is best saved for updating player and object conditions/positions/etc., as they are of immediate concern to the player. Pretty terrain is useless if the player can't aim and shoot reliably because the system is waiting on data transfer, or said transfer hogs bandwidth.

The Great Nateholio
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 29th Oct 2007 23:59
@Sigh - I for one appreciate that post. Good stuff to be thinking about on every point made

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 30th Oct 2007 03:59 | Edited at: 4th Nov 2007 09:14
@Jason,
funny you should mention the "do it once" option. I learned that lesson last night. I was just commenting out the code after I made the tiles. Well, I forgot to comment it out and reran, and, well, 16,000 files later...
So, now it checks to see if the files are already created. Anyway, here is the demo. Like I said, it WILL take up space, and it WILL make your PC work.

ReEdited:
new project file. Now includes a 32mb .flt file, makes a large grid 36x22 tiles. Added ability to resize the display grid at runtime. Added color shading.
It wouldn't let me upload it here, too big I guess, so it's on my website:
http://www.st-rider.net/downloads/demo/TerrTestProj.zip

one more edit:
replace this function with this:


another edit: I found a small bug that will not allow you to create the .bin files, because I didn't create the directory properly. It's right at the beginning of the code:

jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 3rd Nov 2007 16:51 | Edited at: 3rd Nov 2007 19:23
Latitude with NORTH Entries, North of the Equator have POSITIVE Latitude Values up to +90.

Latitude with SOUTH Entries, South of the Equator have NEGATIVE Latitude Values down to -90.

Longitude with WEST entries of the Prime Meridian have NEGATIVE VALUES Out to -180.

Longitude with East entries of the Prime Meridian have POSITIVE VALUES Out to +180.

In an attempt to make this as seamless as possible, based on 1km Mesh Tiles, the following has been set up:

Each DEGREE SQUARE (lat/long) has been divided into a CALCULATED number of tiles across (see chart below), and 112 KM (111.19492664455873 km) of tiles down. This isn't perfect, as the tiles are 1km squares, and the world isn't square for starters. There will be overlap. Hopefully it won't be too difficult to make a terrain engine display the "overlaps" together as needed, as they should meld nicely if the data used to create the meshes is reasonably intact. I also have given thought to other ways of presenting the information - like tiles that are low resolution but cover a larger square area than 1 km - for higher altitude views.

My first charge is to try to get this working. Once it works, I'm hoping I can take a solid code base and make it more flexible by changing it to consider the scale automatically via function parameters. "Give me TILE Filename and X,Y coord for Scale Factor X" or some such business.

Some more #'s - Each Latitude Degree contains in it: 111.19492664455873km
Therefore Each Degree (or ONE) divided by this above amount gives you: 0.0089932160591873057074081366006868 Degrees per km.

The HTML File "Calculate Longitude Radial widths.html" that's in the DOCS directory is actually a hacked up web page whose source remains intact. (Original: http://www.movable-type.co.uk/scripts/latlong.html) I tried to make my own web page, but for some unknown reason the javascript function that calculates the longitude widths at the various latitudes would choke on the to-Radian function (NumberHere).toRad(); but it wouldn't do so in the original web page. Perhaps something in the html header makes the javascript adhere to a particular "version" of javascript... dunno.. don't care (YET).

Note: I tried converting the formula to the language being used to hack up the USGS data and place it in the database correctly, but seeing how this also failed, I went with an approximation via a calculated chart. Perfect? No. But it should be usable - and improvements can/will be ongoing if interest and the overall value of this project warrants it.

BTW - I don't care about the north or south pole. Santa Claus already has a GPS

You know, using a GPS to Calculate Lat/Long could plot you in a game - this is obvious - but I find it interesting and cool.

So I just copied the web page above, and added a button called "make chart". It gives you this:
----------------------------------------------------------------
Longitude Radial widths at the various latitudes
----------------------------------------------------------------
Lat: 0 width: 111.19492664455873
Lat: 1 width: 111.17799068882648
Lat: 2 width: 111.12718798166408
Lat: 3 width: 111.04253400159948
Lat: 4 width: 110.92405454092979
Lat: 5 width: 110.77178569784836
...
...
...
Lat: 87 width: 5.819419154609052
Lat: 88 width: 3.8805977812483374
Lat: 89 width: 1.9405944300618482
--------------------------------
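For what it's worth, the chart can be reproduced with a simple spherical approximation (this ignores the ellipsoidal issues mentioned elsewhere): the width of one degree of longitude is roughly 111.19492664455873 * cos(latitude). A sketch in Python:

```
import math

KM_PER_DEGREE_AT_EQUATOR = 111.19492664455873

def lon_degree_width_km(lat_degrees):
    # One degree of longitude shrinks with cos(latitude) on a sphere.
    return KM_PER_DEGREE_AT_EQUATOR * math.cos(math.radians(lat_degrees))

for lat in (0, 1, 2, 87, 88, 89):
    print(lat, lon_degree_width_km(lat))   # matches the chart values above
```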

I'm currently trying to work out a few things:

1: Nailing down the Haversine formula - as opposed to using the chart
2: Nailing down a way to get 10-meter EXACT plotting globally - so I can maybe make non-square, dead-on accurate tiles versus overlap
3: Making the "binary splitter routines" to turn GridFloat downloads from USGS (FLT files) into tiles (for the tile making process)

I know full well that it might just make more sense to allow a few "flaws" and move forward than to try to outdo Google.

[edit] Number 1 - Haversine formula - woo hoo - got it working up to 15 digits of precision! Visigoth! This means we can plot a specific "DOT" exactly where it's supposed to go - right in the correct dir, tile, and X,Y location! Now I just need to "remove" the chart... and use the dynamic calculations instead!

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 3rd Nov 2007 19:28
@Jason-
If you can get this Lat/Long thing worked out, that would be awesome. I was hoping someday to have the ability to use GPS tracking data to show places or routes someone traveled. If you get this working, is it something you plan on sharing? I hope so.

Sigh's comments about precalculating and doing this a bit at a time got me thinking. I REALLY, REALLY think making the tiles on the fly, as opposed to making .dbo's, is the way to go. Even in the lowest res format, the .dbo file for each tile is 32k. The actual tile height data is 40k. Medium res tiles and hi res tiles are much larger than 40k (122k and 477k). And that is without any texturing.

So, I looked at the functions again to see how I could make the tiles in real time faster, or at least appear to be faster, and here is where I am at. I broke the function down into 5 smaller functions. First, we make the basic grid and weld it. That's two functions out of the way. Load one of these meshes for each detail level before we enter the main loop. Give them a number that we would never use in game. Then, when we call the functions to create the meshes on the fly, we just CLONE the ones already loaded. This was a HUGE increase in speed. Now, all we need to do is set the height values and set the normals. And btw, SET OBJECT NORMALS() is slow, and is the whole reason tile sizes are limited to 101 columns by 101 rows.

Now, here is the tricky part, where Sigh's ideas come into play. I wrote another function to gradually fire off the individual functions. Basically, it uses a counter to keep track of how many main loops have occurred. So, you call the function from the main loop and pass in a counter value; if the value equals the set value, it will set the height values. Since it uses the counter value to check, it skips the next two functions. This has the benefit of almost eliminating the "stutter" you get after you load or create a tile. Also, I'm trying to implement an event type system to control all of this. I'd say I'm about 50% done rewriting the last demo with this method, and I expect it to be even faster, and without all the .dbo files.
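A rough sketch of that counter-driven staging (Python pseudocode standing in for DBP; the function names are placeholders, not Visigoth's actual code):

```
# Spread the expensive steps of building a tile across several main-loop
# passes so no single frame takes the whole hit. Names are placeholders.

CLONE_GRID, SET_HEIGHTS, SET_NORMALS, DONE = 0, 1, 2, 3

def clone_template_grid(tile_id): pass   # cheap: copy the preloaded grid mesh
def set_height_values(tile_id):   pass   # push the heights into the clone
def set_object_normals(tile_id):  pass   # the slow step, done on its own frame

class PendingTile:
    def __init__(self, tile_id):
        self.tile_id = tile_id
        self.stage = CLONE_GRID

    def step(self):
        # One small piece of work per call; returns True when the tile is done.
        if self.stage == CLONE_GRID:
            clone_template_grid(self.tile_id)
        elif self.stage == SET_HEIGHTS:
            set_height_values(self.tile_id)
        elif self.stage == SET_NORMALS:
            set_object_normals(self.tile_id)
        self.stage += 1
        return self.stage == DONE

# In the main loop: advance at most one pending tile by one stage per frame.
```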

I think next on my list, anyway, is some network experimenting, and how to send and receive the data files.
jason p sage | Joined: 10th Jun 2007 | Location: Ellington, CT USA | Posted: 4th Nov 2007 04:28
@Visigoth - How are you? BTW - I emailed you about something you said that I was confused about, but have since managed to find some docs: the HDR file "CELLSIZE". I now know it's in degrees - and that 1 arc sec has a CELLSIZE of 0.00027778... thereabouts, which MEANS 30 meters per square, and the 1/3 arc second CELLSIZE of 0.00009259259600001 = 10 meters - and that both of these have slight rounding and typical FLOAT scientific notation. Anyway - thought I'd document this.

This meter thing is important as I'm trying to relate the haversine formula to our finest resolution - and build from there. Now that I'm convinced about it, I'll continue.

I saw your demo - and it's pretty neat. Great job. There definitely is a base of code now, plus the util I'm working on, that should hopefully make this stuff easier. I really think we may need to figure out a pretty decent SHADER for texturing the terrain based on altitude and slope angle - like others on here (like I believe Van B and Green Gandalf are doing). This seems like the most memory efficient way so far to texture it. Need to figure out how they overlay that "DETAIL TILE" on Advanced Terrain. I think we need something like that - it looks great; FarCry has it - they have the same basic look as Advanced Terrain, but theirs is obviously much better. (Talking about FarCry - not the latest title from UBIsoft.. just FarCry)

The whole thing about DBO's, Sigh's idea, etc. - I'm totally for checking out what is better. Sounds like you're way into this stage already - and this sort of reminds me of the ROAM techniques.

I definitely like the "little at a time" technique - as this, like you said, spreads out the workload over time. I was thinking though that any "final" method for rendering would probably be hard to make 100% COOL and generic at the same time. Maybe not - and I'm not meaning to sound like a pessimist - it's just my experience and research suggest one commonality: terrain rendering is very GAME GENRE specific - FPS, flight sim, RPG, etc.

(Don't know how I'm going to pull off both flight sim and FPS in Iron Infantry - Geeesh - but I'll figure out some playable compromise!)

My head is still in the numbering clouds - and I'm wondering how hard it would be to load the data I'm setting up. Basically, there are 111.1949 km VERTICALLY between every latitude degree (about 111,195 meters vertically). But the number across PER 10-meter high row is going to vary based on the latitude - the farther from the equator you go, the lower the number of "cells" across for a given row. This means there may not be a nice "square" data set - unless the BIN loader hops around a bit. This is the stuff I'm working on - and this is key (I think) to making the GPS stuff work - by making it accurately "plottable" via long/lat.

I know I've been repeating myself a bit - but some of it honestly is so I can reflect about it - and also so if I'm doing something stupid - someone hopefully speaks up and says "NO MAN - TRY THIS!" or something.

Ok - Well - back to my charge... and good job on the demo.

Visigoth | Joined: 8th Jan 2005 | Location: Bakersfield, California | Posted: 4th Nov 2007 06:36
@Jason,
doh..
I sent you email, but see you have the answer. I did write the grid function to accept float values for the spacing, so this should be no problem for properly spacing the grids. I'm a little loopy right now, been trying to make a better color scale, and I think I might have it figured out.
Phaelax
DBPro Master
21
Years of Service
User Offline
Joined: 16th Apr 2003
Location: Metropia
Posted: 4th Nov 2007 07:19
Ran your first demo posted on page 1, chose whatever terrain #3 was and it took about 1-2min to load but had like 300-400+ fps.


Visigoth
19
Years of Service
User Offline
Joined: 8th Jan 2005
Location: Bakersfield, California
Posted: 4th Nov 2007 08:17
@Phaelax,
you must have a seriously fast pc. Awesome.
jason p sage
16
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 4th Nov 2007 15:27 Edited at: 4th Nov 2007 17:44
@Phaelax and Visigoth - Regarding Phaelax's FPS - I'm envious!

Any thoughts on shaders for texturing?

Microsoft (for their flight sim) did a program that took satellite photos (not radar heights) and made a series of textures based on "common" kinds of terrain, then ran a process that looks at specific locations and decides which textures "fit the bill" for that area. If it's a LANDMARK area, they use a doctored-up REAL photo texture for that spot - or something like that. This allowed them to "quickly" texture the whole earth and then zero in on specific areas they wanted to enhance. Roads, I think, are an overlay, as are waterways.


Ok - storage and floats - I added a VERY CRUDE picture to try to explain how the tiles are getting diced into different directories. I exaggerated the longitude radial widths so you can see their effect on the data.

the "Seam overlaps" of one pixel are not part of the "Storage" I figure this would begot during the loading and creating of actual meshes.

[edit] Some of the wording in the pic is a bit rough too... What I was trying to convey is that there are 1km-high HORIZONTAL ROWS of PIXELS/BIN data. Each individual PIXEL ROW WITHIN a ONE KM ROW has its own width. Basically the 1KM HORIZONTAL ROW is a STRIPE. One STRIPE is 100 PIXELS (of 10 meters each) TALL - but the NUMBER OF PIXELS ACROSS per HORIZONTAL PIXEL ROW is based on the longitude radial width. Basically, a 1 km high horizontal STRIPE could have a "triangle" appearance if it was exaggerated.

[edit2] A note about the one KM rows - they are not EXACTLY a kilometer, BECAUSE one degree of latitude is not exactly 112 km. It's more like... well, I'll post the comment from my source code - I THINK it explains it. Remember - with this deep a directory structure, you don't want a TON of files or DIRs in any one folder - so it's kind of arbitrary, but pretty darn accurate.



Also - the whole triangle thing described earlier has become simpler - and easier to load - if a bit slower - because each PIXEL row within a "STRIP" is its own file. This means each bin is just a horizontal row of floats - one row, whose count is based on the longitude radial distance for the given latitude, from the haversine formula. Eat your heart out Google - Indies are dangerous!
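Reading one of those little row files back is about as simple as it gets - something like this (a sketch only: I'm assuming plain 32-bit floats with no header, the array and file names are made up, and file channel 1 is hard-coded):

rem RowHeights#() is a global array, DIM'd big enough for the widest (equator) row.
dim RowHeights#(11119)

function LoadRowBin(path$, cellCount)
   if file exist(path$) = 1
      open to read 1, path$
      for i = 0 to cellCount - 1
         read float 1, h#
         RowHeights#(i) = h#
      next i
      close file 1
   endif
endfunction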

[Attachment: diagram of the tile/stripe directory layout]
Visigoth
19
Years of Service
User Offline
Joined: 8th Jan 2005
Location: Bakersfield, California
Posted: 4th Nov 2007 17:52
@Jason,
I get it, seems like a good method to do this. As for shaders, I thought about it a bit. I was actually hoping to use the other data available to help with this. Like the foliage coverage and street maps, etc. Some early shader tests with just simple diffuse lighting do show a significant performance hit when setting the effect on many tiles. Another idea I'm working out is "carving" in the roads. More on that later.
I fixed the color scale routine, so even Mohawk Valley terrain colors properly.
So far, these tiles load faster than either .x or .dbo. So, I'm convinced this can be done with only the use of the .bin files, unless...
we want to premodel some of the landmark areas.....hmm.....
jason p sage
16
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 4th Nov 2007 19:30 Edited at: 4th Nov 2007 19:58
Quote: "I get it, seems like a good method to do this"

Cool - we only lose 4 meters per 111.19 kilometers this way - the width of a narrow country road per full DEGREE square. Though, due to how you build tiles (which would need some tweaking to work with this thing I'm setting up), in theory it would not even be noticeable. In fact - I'm not positive - but there may even be a way to handle these: by forcing the fractional piece of the "less than half a pixel" (10 meter tile), I may be able to "make way for it"... hmm... yeah - I'll try this.


Quote: "I was actually hoping to use the other data available to help with this"
Me TOO! Especially if I could get "foliage altitude" for forests - because then I could make canopies based on satellite info - something I was planning on doing anyway, via a homemade tool if necessary! I really want the Paper-Rock-Scissors kind of thing working in Iron Infantry - where the canopy would even the odds for both infantry and tanks against helicopters.


Quote: "Some early shader tests with just simple diffuse lighting do show a significant performance hit when setting the effect on many tiles"
- I was afraid this might be the case. Besides - I'm personally not thrilled that FOG doesn't affect shaders - so if you use shaders, you then need FOG shaders too - UGH. To bumpmap or not to bumpmap... ugh... Bumpmapping has the same issue.

Quote: "Another idea I'm working out is "carving" in the roads"
Sounds neat - yeah, there is road data available - and there was a writeup in this issue of the TGC creators mag that talks a little about SPLINE meshes for roads or something. Sounds neat. I wonder - would rivers work the same way? Global terrain would mean that SEA LEVEL water won't do for rivers and lakes etc.


Quote: "So far, these tiles load faster than either .x or .dbo. So, I'm convinced this can be done with only the use of the .bin files, unless...
we want to premodel some of the landmark areas.....hmm..... "


In general - so far you have been loading VERY "perfect" "squares" of data from single FLT/bin files. What I've been working out will mean opening and reading quite a few little files - adding overhead - if they are not precompiled into objects (dbo/x). You know, precompiling MIGHT be best if only half "done". What? I mean, what if you made and saved MemBlocks that are just lacking the final make-object-from-memblock step - or something to that effect? Then the processing is cut in half or something - I don't know - just an idea.

Quote: "I fixed the color scale routine, so even Mohawk Valley terrain colors properly."
EXCELLENT!

Yeah - I finally have this little "storage" system pretty workable - though I'm going to give the half-tile loss thing some thought and see if I can't get some logic so "half" tiles ALWAYS land on the "bottom" of the previous degree/kmrow/pixelrow - so it's pretty much lossless. Not bad now - 4 meters lost out of every 111,190! Not too shabby. (DarkBasic DOES have double floats I hear - never used 'em - and without 'em this won't work!... at least not without making it even more complicated.)

Ok - Well - while you're figuring out carving, I'm going to try to solve the 4 meter loss thing...

[edit] Well - I'm thinking it's pretty darn close; we are only talking about 4-meter horizontal lines at every FULL degree of latitude - and frankly, I'm not trying to land a space shuttle, and I'm not even convinced it will ever be noticeable. Granted, only tests would show conclusively. Furthermore, I'm not convinced it's really even an "issue" that could be fixed - because of the rounding, I think these missed half tiles will default to being the first row of the next row anyway.

With that in mind - I'm moving on to reading in satellite data - preferably 1/3 arc second - but I'm thinking I should make it capable of 1 arc second as well - so we can cover the planet.

I also think as data is encountered, the "empty" files it is to "occupy" should be filled with -9999, which means no data - and then use that to decide whether or not to AVERAGE incoming data. So - if you overwrite data for a given "pixel" with the same info, it won't change - but it will "average out" with overlaid data that differs. This might be key to reducing "SCAN SEAMS", as the USGS admits their data is skewed where it starts and stops - not much, but enough to make seams. Averaging might help this - and the REAL HOPE is that it makes 1/3 arc second data and 1 arc second data "meld" together more nicely.
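The merge rule itself is trivial - something like this (just a sketch; a plain running average is assumed here, nothing smarter):

#constant NO_DATA -9999.0

rem Merge an incoming sample with whatever is already stored for that pixel.
function MergeSample(existing#, incoming#)
   local result#
   if existing# = NO_DATA
      result# = incoming#
   else
      result# = (existing# + incoming#) / 2.0
   endif
endfunction result#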

Finally - REAL Progress - the storage system is complete - Now to implement it - by actually putting GRIDFLOAT data in it!

calcyman
16
Years of Service
User Offline
Joined: 31st Aug 2007
Location: The Uncertainty Principle
Posted: 4th Nov 2007 20:34 Edited at: 4th Nov 2007 20:42
I don't like polar-coordinated spheres. I prefer coordinates based on an icosahedron, as that way you don't get inelegance like you do with the "globe" model of the planet.

By the way, to clear up distances:

Earth, if it were spherical, would be exactly 40,000 km in circumference (a kilometre was originally defined as 1/10,000 of the distance from the north pole to the equator, through a place in France).

That means a degree is 111.111 kilometers.

1 arcminute is 1,851.(851) metres
1 arcsecond is 30.(864197530) metres

If you're okay with mapping over 500 million square kilometres, then go ahead and map the Earth. (Although most of the terrain is boring)

Your signature has been erased by a hyper-intelligent pan-dimensional being (a mod)
jason p sage
16
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 4th Nov 2007 21:21 Edited at: 5th Nov 2007 01:13
Thank you CalcyMan.

My work thus far is based on the haversine formula, which is reported to be accurate to within a few meters for measuring the distance between any two lat/long coordinates - and it does this based on circumference distances. I'm currently getting 111.1949266445587 km per latitude degree.

I'm also aware of the ellipsoidal shape of the earth - but using Vincenty's formula - which is MUCH more complex, but DOES handle the ellipsoidal nature of the earth's shape quite specifically - is not all that much more accurate.

So - I'm thinking this will work pretty well - and "dropping in" a different formula - if necessary - should be very doable - thanks to modular design.
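For reference, the haversine drop-in is only a few lines. A DBPro sketch (not my exact code: DBPro's SIN/COS/ASIN work in degrees, the 6,371,000 m mean radius is an assumed constant, and single floats are used here even though the real thing wants doubles):

rem Haversine distance in metres between two lat/long points (degrees in, metres out).
function HaversineMeters(lat1#, lon1#, lat2#, lon2#)
   local r# : r# = 6371000.0
   local halfDLat# : halfDLat# = (lat2# - lat1#) / 2.0
   local halfDLon# : halfDLon# = (lon2# - lon1#) / 2.0
   local a# : a# = sin(halfDLat#) * sin(halfDLat#) + cos(lat1#) * cos(lat2#) * sin(halfDLon#) * sin(halfDLon#)
   local cDegrees# : cDegrees# = 2.0 * asin(sqrt(a#))
   local dist# : dist# = r# * cDegrees# * 3.1415926535 / 180.0
endfunction dist#

rem HaversineMeters(34.0, -118.0, 35.0, -118.0) -> about 111195 m, i.e. one degree of latitude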

Concerning your comment - about
Quote: "mapping over 500 million square kilometres"
I couldn't agree more!

But - the beauty of this system I'm devising for storing the height info is that it's only as complex (BIG) as the data you import into it. You can import 1/3 arc sec and 1 arc sec, and it automatically breaks up the source data into smaller files in the proper places for recall, dynamically adjusts the directory system on demand as needed, and is a suitable system for mapping either Hawaii or the entire earth.

So - in short - One could use it to only map the really interesting terrain

Have you ever tried Ocean Maps? Google has them - I wonder what that format is like - hmmm.....


Calcyman - could you elaborate, please, as to why the icosahedron model? Is it because of 3D rendering issues? Navigation and measuring? Or what? Does it make the math easier somehow?


[edit] OK - so I have this algorithm for cataloging USGS data, and it appears to be pretty darn accurate - according to the math anyway - I still have a lot of testing to do to ascertain reliability. But frankly, cataloging isn't quite as fast as I had hoped. It is "smart" in that it averages overlapping information, makes a nice binary mosaic with regard to radial distances at various latitudes, etc. - but it could use one thing: some octane. I didn't write the cataloging to be fast - but I didn't think it would be this slow either! I think it's due to the doubles just costing a lot in overhead. Dropping the haversine function - just for a speed test - did make a difference - but the double precision... well, there is a lot of it. Hmm.

I'm going to let my first "UPLOAD" to this database finish, and then I'm going to write some code to do the reverse - read the data out. That is the benchmark I'm worried about. Frankly - if the reads are just as bad - or even close to this slow - I'm going to have to think hard about streamlining this.

It is MUCH faster working with the actual FLT files - but then I guess I would need an INDEXING system - so instead of trying to break up the data, I would just measure and catalog it - then when a request is made, it takes the requested coordinates and puts the data together in memory as if it were "one bin file" of the size requested. Hmm. This would need to be a smarter app - because one request could span many files - and scale would need to be considered as well... Hmmm...

I'll wait to see what the read back is like. Right now it is very convenient in that you can give it any set of coords and get the height. No data? You get -9999 (even when there isn't data in there for the area of the world you asked for) - makes it pretty easy to use. Hmm.

If I go with indexes, it will be harder to process overlapping data - whereas with what I have now, it's a no-brainer - it just handles it. ... Things that make you go hmmm... this is one of 'em.

Sigh
18
Years of Service
User Offline
Joined: 26th Dec 2005
Location: The Big 80s
Posted: 5th Nov 2007 06:59
@Jason

While your work on trying to be as accurate as possible for earth data from the USGS is commendable, I think you need to consider one thing - what will the widest use of this stuff be once all the code is complete?

Will it mostly be used to make game maps of earth, or fictional maps of other worlds/epochs?

If it's the latter, there shouldn't be a need for the super accurate operations you are trying to clear up, as the world data would be generated by the creators of the game and would automatically be "lined up properly" (assuming the creators aren't doing it half-assed).

The Great Nateholio
<img src="http://ixeelectronics.com/Nateholio/Pictures/Sigblock.PNG">
calcyman
16
Years of Service
User Offline
Joined: 31st Aug 2007
Location: The Uncertainty Principle
Posted: 5th Nov 2007 08:27 Edited at: 5th Nov 2007 08:28
An Icosahedron is a 20-triangle platonic solid. Instead of getting problems regarding fitting the terrain together, it works much better:

You divide the surface of the Earth into 20 equilateral triangles (all the same size) like the surface of an icosahedron.

This is more info about it:

http://en.wikipedia.org/wiki/Geodesic_grid

And I have derived a formula for finding the icosahedron coordinates:

Top Vertex: 90°N
Bottom Vertex: 90°S

Top row of vertices: ATAN(0.5)°N, 0,72,144,216,288 °E
Bottom row of vertices: ATAN(0.5)°S, 36,108,180,252,324 °E

To find in-between vertices, find the midpoint of the two vertices.

From this you can find which triangle the coordinates lie in. (You may want to offset the longitude to stop 360-0 transition errors)

Then find which sub-triangle the coordinates lie in. Repeatedly iterate this, and you will have converted polar coordinates into icosahedronal coordinates.

This would reduce anomalies at the poles, allow seamless transitions, and allow you to use triangle-celled grids (which admittedly look more natural than square-celled grids).
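In DBPro terms the 12 vertices are easy to generate - a sketch (ATAN works in degrees, and the array names are just illustrative):

rem Global arrays holding the lat/long of the 12 icosahedron vertices.
dim IcoLat#(11)
dim IcoLon#(11)

function BuildIcosaVertices()
   local ring# : ring# = atan(0.5)                     ` about 26.565 degrees
   IcoLat#(0) = 90.0 : IcoLon#(0) = 0.0                ` north pole
   IcoLat#(11) = -90.0 : IcoLon#(11) = 0.0             ` south pole
   for i = 0 to 4
      IcoLat#(1 + i) = ring# : IcoLon#(1 + i) = i * 72.0              ` 0,72,144,216,288 E
      IcoLat#(6 + i) = 0.0 - ring# : IcoLon#(6 + i) = 36.0 + i * 72.0 ` 36,108,180,252,324 E
   next i
endfunction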

We cannot use either DBPro matrices, or AT for this. We need to use the vertexdata commands.

By the way, each coordinate can be given by a triangle number, followed by a base-4 string specifying which sub-triangle the coordinates are in. Like this:

19,31230130213

19 - Tells us we are in main triangle 19

31230130213 - Tells us we are in the 3rd section (out of 4) of the main triangle, the 1st section of the smaller triangle, the 2nd section of the even smaller triangle... (giving increasing levels of detail, in both dimensions, on each iteration)



Your signature has been erased by a hyper-intelligent pan-dimensional being (a mod)

[Attachment: icosahedron illustration]
jason p sage
16
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 5th Nov 2007 13:25
@Calcyman - That is very interesting - I see how that might be advantageous, and I agree about the seams. Thank you for the pics as well. I'm curious what others think about this too.

@Sigh -
Quote: "While your work on trying to be as accurate as possible for earth data from the USGS is commendable, I think you need to consider one thing - what will the widest use of this stuff be once all the code is complete?"


Well - what I have been thinking is that if I have a way to "catalog" USGS data that's either 1/3 arcsec or 1 arcsec, handles overlapping downloads, and allows recall via lat/long coordinates, I would have a reliable "GPS style" database system.

My main goal is to allow storing as much or as little as desired - and to allow "preprocessing" or "picking what you want" out of this database as you need it. For example, two tiles from USGS might each be half of what you want - and to get it into one workable piece, processing would be required. Well, this setup I have going allows asking for "user defined" tiles once the data has been cataloged.

The idea being one could grab whatever they need - from an individual "Pixel" of height data to a series of requests to pull out a custom made tile of whatever resolution is needed.

The idea (or need) surfaced when I saw how Visigoth's functions were designed for 1km tiles - but the USGS data doesn't readily come this way - and if a tile didn't quite cover the area you wanted, you would need to do a lot of work to get Visigoth's functions to "cross tile borders". This system TOTALLY allows this. You can download and "catalog" a ton of tiles, all shapes and sizes, etc. Then Visigoth's functions could be tailored to load grid data from this "database" for any area, any size desired, based on lat/long coords - that is what I've accomplished so far.
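To make that concrete, the lookup from a lat/long to a spot in the directory tree goes something like this - a heavily simplified sketch (positive coordinates assumed for brevity, and the folder/file naming here is invented for illustration, not my actual scheme):

rem Map a lat/long to the degree folder, 1 km stripe and 10 m pixel-row file it lives in.
function HeightCellPath$(lat#, lon#)
   local degLat : degLat = int(lat#)
   local degLon : degLon = int(lon#)
   rem metres north of the bottom edge of this degree square
   local metersUp# : metersUp# = (lat# - degLat) * 111194.93
   local stripe : stripe = int(metersUp# / 1000.0)
   local pixelRow : pixelRow = int(metersUp# / 10.0) - stripe * 100
   local path$
   path$ = "ned/" + str$(degLat) + "_" + str$(degLon) + "/stripe" + str$(stripe) + "/row" + str$(pixelRow) + ".bin"
endfunction path$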

I'm not happy with the speed, however. There is too much opening/closing/seeking within the files during this process.

I need to make a test to see how fast reading from the database is next. If I cannot implement a FAST READER, then I need to go back to the drawing board. If I CAN make a fast reader, then I just need to optimize the cataloging process - which scans a directory for USGS downloads and processes the whole shebang.

There is a lot of "meta data" that comes with the downloads that I'm not currently using - like landmark info etc. Not a ton - but there is some (and more can be obtained with more downloads of different types - like roads, foliage - shoot, even yearly rainfall).

So if I hit the drawing board again and devise a way to make a faster database, I'll see if I can't work the metadata in and keep it available through a consistent interface.

I'm thinking bigger "chunks" of bin data, with index files that contain metadata about the file - like how many height pixels (10 meter) there are in row 1, how many in row 2, etc. - squares just don't work well... Something along these lines - also so information based on formulae doesn't come out with "off" results when the code is ported to another language. Having certain information precalculated makes a pretty foolproof way for the program to figure stuff out - less computation, more file IO, however. I haven't figured this all out.
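Something like this per-row index record is what I have in mind - the fields are invented for illustration, precomputed at catalog time so the loader doesn't have to redo the math:

rem One entry per pixel row inside a chunk file; everything precalculated when cataloging.
type RowIndexEntry
   latDegree as integer
   stripe as integer            ` which 1 km stripe within the degree
   pixelRow as integer          ` which 10 m row within the stripe
   cellCount as integer         ` how many height pixels are in this row
   fileOffset as dword          ` byte offset of the row inside the chunk file
endtype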

Any google search for "how do I build my own fast global database using USGS data" produces zero results!

calcyman
16
Years of Service
User Offline
Joined: 31st Aug 2007
Location: The Uncertainty Principle
Posted: 6th Nov 2007 21:25
@Jason p sage - I've just had an idea:

Why don't you let the users of the program contribute their own data, like photos, and then use them to create a more accurate construction of the Earth. You could also have it so when a user "walks" across the terrain, it optionally displays statistics such as:

Latitude/longitude
Average temperature
Average rainfall
Points of Interest in the area

People might prefer the heightmap data downloadable in image form as well. I think a bitmap could be loaded quickly, with each layer (Red, Green, Blue) corresponding to things such as height, texture etc.
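For example, a height could be packed into the red and green channels so you get more than 256 steps out of an ordinary bitmap. A sketch (the decimetre unit and channel layout are just one arbitrary choice):

rem Pack a height (in decimetres) into an RGB colour: high byte in red, low byte in green.
function HeightToRGB(heightDecimetres)
   local r : r = heightDecimetres / 256
   local g : g = heightDecimetres - r * 256
   local colour as dword : colour = rgb(r, g, 0)
endfunction colour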

Your signature has been erased by a hyper-intelligent pan-dimensional being (a mod)
jason p sage
16
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 7th Nov 2007 13:25
Quote: "@Jason p sage - I've just had an idea"

Go On....

Quote: "Why don't you let the users of the program contribute their own data, like photos, and then use them to create a more accurate construction of the Earth."
I have no idea how to go about this. I'm kinda perplexed now, actually - and as a retreat I have started playing "Fate" (the first RPG I've liked) as a bit of a diversion.

I was able to get an ACCURATE system worked out - but it's awfully slow. Not the DIRECTORY storage structure, but converting the height "pixels" to long/lat, plotting them, then storing them. I've thought about the kinds of optimizations I could work out - but if that kind of stuff will be needed to "read the data" as well, then this system would only be good for allowing "custom datasets" to be downloaded - basically storing one mosaic of data and returning any size tile you want from anywhere in the world - but not a speed demon, at least not without a ton of "fuzzy" optimization logic trying to address each bottleneck. Converting doubles to their whole number portions and mantissa at various stages in the calculations seems to be one part of the lag disease - the other is how I'm currently opening and closing files too much to get to the "pixel" I'm plotting in the routine. Logic that could look ahead would help some (avoiding repetitive opens of the same files), but even that won't be enough, as there are a lot of little 40k files for each "pixel row" (whose width is the longitude radial distance at the given latitude). What I have is solid - just not fast.
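The look-ahead doesn't have to be fancy - even just remembering the last file opened avoids most of the repeat opens. A sketch (the variable name and the fixed file channel 1 are placeholders):

rem Only close and reopen when the requested path actually changes.
global LastOpenPath$

function OpenRowCached(path$)
   if path$ <> LastOpenPath$
      if LastOpenPath$ <> "" then close file 1
      open to read 1, path$
      LastOpenPath$ = path$
   endif
endfunction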

Would the users require an interface to "scale" the images somehow onto the terrain (to be able to add their own)? I could see this working for aerial shots - otherwise you've got me - I'm not sure where to begin.

Quote: "
You could also have it so when a user "walks" across the terrain, it optionally displays statistics such as:

Latitude/longitude
Average temperature
Average rainfall
Points of Interest in the area
"


This sounds like an application all its own - and it's possible - but my goal is to get back on track for game development sooner rather than later. It would be a cool application - shoot - it would be a neat educational program - the USGS does have all the kinds of information you mentioned - plus I think population etc. Not to mention roads and stuff.

Quote: "People might prefer the heightmap data downloadable in image form as well"
Agreed.

Visigoth and I have talked a little about converting the data to heightmaps - my motivation was to make the data more flexible - want Advanced Terrain? Want to import the heightmap into a terrain editor? etc. Oh yeah - this makes a lot of sense. I agree about the bitmap loading - though on a side note I've been pondering how DBPro uses video RAM for a lot of this stuff - and this makes me think straight-up memory maps (even bitmaps manhandled into a memory location other than the video card) are a good option - if the data is not going to be rendered directly by the video card.

Visigoth
19
Years of Service
User Offline
Joined: 8th Jan 2005
Location: Bakersfield, California
Posted: 8th Nov 2007 03:19 Edited at: 8th Nov 2007 03:21
@Jason
Funny, you keep bringing things up I'm halfway done with
I've been experimenting with bitmaps, and how I can utilize them to "carve" the roads into the terrain. And I have a way to do it, and it's not so hard. But now I want to use the same method to populate the terrain with different items, like trees, bushes, buildings, whatever.
I also want to figure out a way to make the terrain tiles FASTER. I've been peeking into Cloggy's source to see how he accessed D3D. We need to be able to make objects using triangle STRIPS, or maybe just points, but not triangle LISTS (long story). I think I'm going to have to make a .dll to allow us to do this, and I think (I hope) the speed increase will be WAY worth it.
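For a sense of the scale of the difference, here's a back-of-envelope vertex count for a grid tile (just arithmetic, not D3D code):

rem Compare vertex counts: triangle list vs one strip per row of quads.
function StripVsListVerts(cols, rows)
   local quads : quads = (cols - 1) * (rows - 1)
   local listVerts : listVerts = quads * 6
   local stripVerts : stripVerts = (rows - 1) * (2 + 2 * (cols - 1))
   rem e.g. a 101x101 tile: listVerts = 60,000 as a list vs about 20,200 as row strips
endfunction stripVerts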
I've been thinking about where I am at so far, and whether there is any useful application at the moment. I think I have a decent 3D terrain visualization concept, and I am very close to a decent 3D terrain visualization application. So, with that in mind, I think I'm going to focus, for the time being, on putting something together that I can present to the USGS as terrain visualization software for the NED data, and maybe they'll add it to their list. Might be useful for education or other purposes. But my real goal is a system for gaming and simulations.
So, just letting you know where I am at. Hopefully another demo out in the next couple of days.

@calcyman
That just makes my head hurt.
jason p sage
16
Years of Service
User Offline
Joined: 10th Jun 2007
Location: Ellington, CT USA
Posted: 9th Nov 2007 05:47
Quote: "Funny, you keep bringing things up I'm halfway done with"
Ah... well... I find that kind of useful as a "validation" of findings, etc. This is a good thing.

Quote: "I also want to figure out a way to make the terrain tiles FASTER. "
I hear ya - I now have a storage/recall system that is accurate to the 10-meter "pixel" - but it's just too freaking slow. Though I think it COULD be viable as a processing step - you load it up, define the TILE "size" and "resolution" mosaics you want, and then let it chunk them out of the "database" as "tiles" made to "meld" together - for the game/presentation load/mesh/display process.

Quote: "We need to be able to make objects using triangle STRIPS, or maybe just points, but not using triangle LISTS.(long story)"
Hahaha... not that long a story - I think you mean, simply put, "recycle vertices" as efficiently as possible so you don't need to pump as much data into the video card.

Honestly I've been scoping out other engines that allow this sort of thing with less effort - but the jury is still out.

Thank You for the update Visigoth - I haven't given up on this - so I really appreciate you letting me know your findings/progress etc as well.

It's late - Must .. s . l .e . e . p

Visigoth
19
Years of Service
User Offline
Joined: 8th Jan 2005
Location: Bakersfield, California
Posted: 29th Nov 2007 05:30 Edited at: 29th Nov 2007 06:10
well,
I've been busy.
Something that I decided I HAD to do was add some texturing to the terrains. Some folks I have been showing my work to were not impressed, because there were no trees, or buildings, or even grass.
I tried to explain that that is like a "paint job". The real beauty of this system is that it can generate terrains with little to no effort.
I decided to finally spend some time on "painting".
I wanted a method to place objects, like trees and buildings and rocks onto the terrain without too much trouble. I also wanted a method to "carve" a road into the terrain.
Well, it turns out, the method I came up with solves those problems, as well as LOD texturing.
I now have the ability to easily texture a terrain tile with a high res image, and also use that image in lower res to set the diffuse colors of the vertices for far away terrain tiles. So, when you are on a terrain tile, you get a high res image textured onto the object, but the far away tiles use only diffuse vertex coloring to simulate texturing. Is it really necessary? Maybe, because it eliminates the need for the app to create mipmaps. And that is just a little bit more memory saved.
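In practice the switch is just a distance check - a rough sketch (TILE_SIZE and the image number are assumptions; distant tiles simply never get a texture applied, so their vertex diffuse colours stand in for it):

#constant TILE_SIZE 1010.0
#constant TILE_IMAGE 1

rem Texture the tile only if the camera is on it or right next to it.
function SetTileDetail(tileObj, camX#, camZ#)
   local dx# : dx# = object position x(tileObj) - camX#
   local dz# : dz# = object position z(tileObj) - camZ#
   local dist# : dist# = sqrt(dx# * dx# + dz# * dz#)
   if dist# < TILE_SIZE * 1.5
      texture object tileObj, TILE_IMAGE
   endif
endfunction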
As for the lighting; this is the best part. Using the default directional light, shadow detail of the terrain can be done in real time, so, it is possible to simulate the sun moving across the terrain, or even clouds blocking the sun.
At this point, I am going to start working on models for basic terrain objects, like rocks, grass, and trees. But, if all you need is a REAL terrain based on REAL geography, and you can place your own detail, then I think I can almost call this done.
Anyway, just screenshots for now, I hope to post a demo in the next day or two, depending on the work schedule. This image is a screenshot of a vertex shaded terrain; there is no texture applied.

[Attachment: screenshot - vertex shaded terrain, no texture applied]
Visigoth
19
Years of Service
User Offline
Joined: 8th Jan 2005
Location: Bakersfield, California
Posted: 29th Nov 2007 05:40
and here is a textured tile, with a 512x512 texture; the outlying tiles have no textures, just vertex coloring.

[Attachment: screenshot - textured tile with vertex-colored outlying tiles]
Visigoth
19
Years of Service
User Offline
Joined: 8th Jan 2005
Location: Bakersfield, California
Posted: 29th Nov 2007 05:45
and one more, that shows the difference between the vertex colored and the textured tiles.

[Attachment: screenshot - comparison of vertex colored and textured tiles]
