RockWare Support Forum



About GregVDS

  1. The ATD way is a good thing, but I have already discovered that the XYZ calculation is awfully slow when dealing with thousands of boreholes. Anyway, I'll try it with a small database. The XLS way I don't quite follow: do you go and edit the MDB file directly, behind RockWorks' back? If possible, I would write a Python script to deal with this and have the enabled/disabled status dumped into a pickle file. By the way, Python would be a must for scripting and working inside RockWorks. All the best, Greg
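The pickle idea from the post above can be sketched in plain Python. The borehole names and flags below are placeholders (the post does not show the actual MDB schema); the point is only that a selection set, once read out of the database somehow, is trivially saved and restored:

```python
import pickle

# Hypothetical snapshot of each borehole's enabled/disabled flag,
# keyed by borehole name (placeholder data, not the real MDB schema).
enabled_flags = {"BH-001": True, "BH-002": False, "BH-003": True}

# Save the current selection set to disk...
with open("selection_set.pkl", "wb") as f:
    pickle.dump(enabled_flags, f)

# ...and restore it later, e.g. after experimenting with another subset.
with open("selection_set.pkl", "rb") as f:
    restored = pickle.load(f)

assert restored == enabled_flags
```

Reading and writing the flags in the MDB file itself would need an ODBC bridge such as pyodbc; the pickle part is the same either way.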
  2. Thanks Alison, this can help in some situations, but here I'm working with a borehole database of more than 3,000 boreholes, the majority enabled and some, erratically, disabled because they have problems, either with coordinates or with interpretations, but I don't want to lose them either. So I have a subset that is impossible to reproduce with any SQL statement. If I apply any new subsetting, based on name or any other criterion, I'll lose this first subset, which is mandatory for my work. Hence the question: is there a way to save the enabled/disabled status of each borehole across the entire database, to reapply it later? Apparently not, and this could really help. Thanks again, Greg
  3. Hi, I would like to know if there is a way to quickly record a combination of enabled/disabled cores. My need is to change the status of huge sets of cores at once (I'm dealing with thousands of cores and testholes). I would sometimes like to be able to deselect all my testholes (only used for chemistry intervals for isopach maps), or to show only the cores matching certain names, and then come back to my former selection (some misplaced ones I discarded, or problematic ones, things like that). So, basically, once you have enabled/disabled some of your cores, you would save just that selection, change it, record again, and later reapply the first selection set, and so on. This could really help in working through the dataset. All the best, Greg
  4. Hi everybody, Usually, once you have all your cores in RockWorks, you begin a long job of making cross sections through your dataset, with some chemical intervals displayed and maybe some lithologies already entered (I made some nice Python scripts to deal with ASCII exports of my database, automatically tagging all my cores based on chemical criteria). The basic task here is to have stratigraphy entered for all cores, so as to be able to create a stratigraphic model. Cross sections are fine, and one can also use distance buffering along them to display more cores. I was thinking about a good new function I call neighborhood comparison: with a core selected, you simply ask for a neighborhood comparison (XY based). You would choose a distance range, in XY units or as a percentage of the project dimensions, whatever. All the boreholes in the close vicinity of the selected one would automatically be displayed at their correct elevations, giving a quick view of the neighbors without having to draw a composite section around the core each time. This could be helpful for finding good local correlations, or for spotting misplaced stratigraphic limits. What do you think? All the best, Greg
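The core of the neighborhood-comparison idea, selecting every borehole within a chosen XY radius of a given one, is a simple distance filter. A minimal sketch, with made-up collar coordinates (RockWorks stores real collars in its database):

```python
import math

# Placeholder borehole collar coordinates: name -> (x, y), in project units.
collars = {
    "BH-001": (0.0, 0.0),
    "BH-002": (30.0, 40.0),    # 50 units from BH-001
    "BH-003": (500.0, 500.0),  # far away
}

def neighbors_within(collars, selected, radius):
    """Return the boreholes (other than `selected`) within `radius` XY units."""
    x0, y0 = collars[selected]
    return sorted(
        name
        for name, (x, y) in collars.items()
        if name != selected and math.hypot(x - x0, y - y0) <= radius
    )

print(neighbors_within(collars, "BH-001", 100.0))  # → ['BH-002']
```

The feature itself (displaying the neighbors at their true elevations) would then be a matter of handing this list to the striplog display.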
  5. Thanks Tom, That's the solution I had in mind: creating virtual boreholes to be able to enter contacts, at the surface or even at a given depth. Thanks for the info on faults; that's a valuable one. I was afraid there was no way to do that, and fractures were not the solution. Many thanks for the info. Best Regards, Greg
  6. Actually, there is a hard limit in X by Y. I'm doing stratigraphic modelling on huge quarries, around 3 by 7 miles. I'm covering this at the moment with around 601 x 321 nodes, giving 192,921 nodes horizontally. Z-wise, I have around 150 nodes, giving 30,095,676 voxels. That is huge, but the cells are 'only' 50 feet by 50 feet, and sometimes, to predict the chemistry of a quarry blast, I would like a better resolution; however, there is a hard limit of one million horizontal nodes. I was wondering, apart from making several different models around the quarry, whether there is a way to get a horizontal resolution better than 1M values. Thanks anyway for your input. Best regards,
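A quick back-of-the-envelope check of the counts in the post, assuming the grid keeps the same areal extent when the spacing changes (so a grid axis has extent / spacing + 1 nodes):

```python
# Node counts quoted in the post.
nx, ny = 601, 321
horizontal = nx * ny
print(horizontal)               # 192921 horizontal nodes

total_voxels = 30_095_676
nz = total_voxels // horizontal  # back out the implied number of Z levels
print(nz)                        # 156 (the "around 150" in the post)

# Same areal extent at finer spacing: extent = (n - 1) * spacing.
def nodes_at(spacing_ft, extent_ft):
    return extent_ft // spacing_ft + 1

ext_x, ext_y = (nx - 1) * 50, (ny - 1) * 50   # 30000 ft x 16000 ft
for spacing in (50, 25):
    n = nodes_at(spacing, ext_x) * nodes_at(spacing, ext_y)
    print(spacing, n, n <= 1_000_000)  # under the 1M horizontal limit?
```

Interestingly, halving the cell size to 25 ft gives 1201 x 641 = 769,841 horizontal nodes, still under a one-million-node limit; it is only below roughly 25-ft spacing that this extent would exceed it.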
  7. Hi, I would like to know if there is any way to take into account, or constrain, a stratigraphy model with geological field data: contacts, dip and strike information, things like that. I would also like to know if I can impose a fault and have the model take that into account as well. Many thanks again, Greg
  8. Hi, Is it possible to make a model bigger than the 1-million-node limit? Many thanks, Greg
  9. OK, thank you for the fast support. Now I'm waiting for the last unlocking code, and it will be up and running again. How big can the files I send you be? My GRD file is around 4.3 MB. Thanks again, Greg
  10. Hi, I have a DXF containing topographic contour lines. I imported it into GlobalMapper 10, interpolated an elevation model there, and saved it as a RockWorks grid file, which I simply loaded into my project. The statistics for the grid are fine, but whenever I try to simply display it in 2D, I get a floating-point zero-division error. Then I exported XYZ data from GM10 and tried to import it into a spreadsheet. The attempt didn't work, and the spreadsheet dimension fields were flagged in red. I closed RockWorks. After that, I got errors at RW14 startup telling me some tables were corrupted or missing; then it opened but had lost my project, and then it was impossible to reopen it. So I uninstalled RW14 and reinstalled it, but the licensing tool now reports error 61 with the HKLM license corrupted. I emailed [email protected] and even completed a re-install form on the website. I have had no answer from anyone, please help me! Thanks, Greg
  11. Hi, I discovered something: when I extract I-data, filtering them between the _TOP and _BOT grids, one interval is missing, the one touching the _BOT grid, and one too many is included, the one on top of the _TOP grid... Is this normal? Can I change that behaviour somewhere in the extraction menus? I think this is why I am getting wrong average values. Many thanks again for the help. Greg
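The symptom described above (one extra interval above _TOP, one missing at _BOT) is what you would see if the filter tested only one elevation of each interval instead of its overlap with the unit. RockWorks' actual test is not documented in the post, so the sketch below is purely illustrative, with made-up elevations:

```python
# Chemistry intervals as (top_elev, bot_elev) pairs, elevation decreasing down.
intervals = [(110, 100), (100, 90), (90, 80), (80, 70), (70, 60)]
top_grid, bot_grid = 92.0, 68.0  # unit boundaries (placeholder values)

# Hypothetical filter that tests only the interval's BOTTOM elevation:
by_bottom = [iv for iv in intervals if bot_grid <= iv[1] <= top_grid]
# Keeps (100, 90), which pokes above _TOP, and drops (70, 60), which still
# overlaps the unit near _BOT — exactly the behaviour described in the post.

# Overlap rule: keep any interval that actually intersects [bot_grid, top_grid].
by_overlap = [iv for iv in intervals if iv[1] < top_grid and iv[0] > bot_grid]

print(by_bottom)   # [(100, 90), (90, 80), (80, 70)]
print(by_overlap)  # [(100, 90), (90, 80), (80, 70), (70, 60)]
```

If the extraction menu offers a choice of boundary test, switching from an endpoint-based rule to an overlap-based one would change which boundary intervals are included.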
  12. Hi, I make statistics maps of I-data. I want a map of the average SiO2 content of a given stratigraphic unit. I build a stratigraphic model, so I have my unit's _TOP and _BOT grids. I XYZ-constrain the Z average of the SiO2 I-data with those grids and obtain a map. I check some of the low and high values shown; these are not coherent with the ones I see in the borehole I-data... I also corrected some stratigraphic definitions in boreholes, changing the depths of my unit's top or bottom, then rebuilt the stratigraphic model and remade the map... Not the slightest change. What should I do? Many thanks for helping with this. Greg
  13. Hello, I am trying to use the I-data extract feature, but it does not seem to work. I constrain the extraction with the _TOP and _BOT grids of a stratigraphic unit, expecting to get all the intervals in that stratigraphic unit listed, but I end up with an empty table. What should I check? Many thanks, Greg
  14. Perfect! Many thanks Molly, this more than fills the bill. I managed to produce the maps, and that's what I was asked for. Knowing how to produce reports and numerical values is a nice thing too. All the best, Greg
  15. Hello, I have a set of boreholes, each of which has intervals containing chemistry, with stratigraphy defined as well. Now I would like to compute a chemical average for each borehole and each stratigraphic unit, based on the chemistry of the intervals contained inside each unit. Is it possible to do this automatically? Many thanks, Gregoire Vandenschrick
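Outside RockWorks, the per-unit average asked about above can be computed as a thickness-weighted mean over the overlap of each chemistry interval with each unit. A minimal sketch for one borehole, with placeholder depths and SiO2 values:

```python
# Placeholder data for one borehole:
# chemistry intervals as (top_depth, bot_depth, sio2_pct),
# stratigraphic units as (name, top_depth, bot_depth).
chem = [(0, 5, 70.0), (5, 12, 80.0), (12, 20, 60.0)]
units = [("Upper", 0, 12), ("Lower", 12, 20)]

def unit_averages(chem, units):
    """Thickness-weighted chemistry average per unit, using interval overlap."""
    out = {}
    for name, utop, ubot in units:
        weighted_sum = thickness = 0.0
        for ctop, cbot, value in chem:
            overlap = min(cbot, ubot) - max(ctop, utop)
            if overlap > 0:
                weighted_sum += value * overlap
                thickness += overlap
        out[name] = weighted_sum / thickness if thickness else None
    return out

averages = unit_averages(chem, units)
print(averages)  # Upper ≈ 75.83 (=(5*70 + 7*80)/12), Lower = 60.0
```

Running this over every borehole in an ASCII export of the database (as the Python scripts mentioned in post 4 do) would give the full borehole-by-unit table.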