Core Costs

Lately, we’ve been experimenting with multi-core programming. This is a technique whereby tasks are split up among multiple processors in order to decrease processing time.

To appreciate the advantages of multi-core programming, and the difference the number of available cores can make, consider this example:

  • We recently calculated that a big, ugly, multivariate probability block model would require 167 days to run on a moderately high-end laptop (ASUS G75VW – quad-core i7-3820QM). This machine processes data at a rate of about 70 gigaFLOPS (FLOPS = floating-point operations per second). That’s 70,000,000,000 operations per second.

167 Days

 

  • For the sake of comparison, the new Cray XC40 is capable of running at 100 petaFLOPS. That’s 100,000,000,000,000,000 operations per second, or roughly 1,430,000x faster than the ASUS laptop. The same modeling process would therefore require about 10 seconds on the Cray.

10 Seconds
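
The comparison follows from simple scaling arithmetic; a quick sanity check in Python, using the FLOPS ratings quoted above:

```python
# Sanity-checking the scaling arithmetic with the FLOPS ratings quoted above.
asus_flops = 70e9      # ~70 gigaFLOPS (ASUS G75VW)
cray_flops = 100e15    # 100 petaFLOPS (Cray XC40)

speedup = cray_flops / asus_flops         # ~1,428,571x
job_seconds = 167 * 24 * 3600             # the 167-day model run
cray_seconds = job_seconds / speedup

print(f"{speedup:,.0f}x faster -> about {cray_seconds:.0f} seconds")
```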

The ASUS G75VW costs about $1,599 and weighs 8.7 pounds.

The Cray XC40 costs about $156,000,000 and weighs 140 tons. [Note: Price is for the 16-petaFLOP/480,000-core version; we were unable to obtain a price for the 100-petaFLOP/1,000,000-core configuration.]

The following table provides some cost-per-processor examples:

Computer                   Price ($)    Processors   $ Per Processor
Lenovo Thinkpad X140e            279             4                70
Lenovo ThinkServer TD340         906             6               151
Adamant AMD FX 8350              953             8               119
Apple Mac Pro                  9,499            12               791
HP Z820                        6,599            12               549
Mediaworkstations i-X2        14,424            24               601
Cray XC40                156,000,000       480,000               325

Note: This comparison does not take clock speed into account.
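
The cost-per-processor column is simply price divided by core count; a few lines of Python reproduce it (figures copied from the table above):

```python
# Reproducing the cost-per-processor column: price divided by core count
# (figures copied from the table above).
systems = {
    "Lenovo Thinkpad X140e": (279, 4),
    "Lenovo ThinkServer TD340": (906, 6),
    "Adamant AMD FX 8350": (953, 8),
    "Apple Mac Pro": (9_499, 12),
    "HP Z820": (6_599, 12),
    "Mediaworkstations i-X2": (14_424, 24),
    "Cray XC40": (156_000_000, 480_000),
}
for name, (price, cores) in systems.items():
    print(f"{name}: ${price / cores:,.0f} per processor")
```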

 

Conclusion: The current generation of quad-core i7 and AMD processors represents the best bang per buck, at roughly $70 to $150 per core.

Mediaworkstations i-X2: $14,424 w/ 2 12-core Intel Xeon E5-2697 CPUs (24 cores)

Postscript: This blog entry was written while waiting for a model to be interpolated on a quad-core machine and fantasizing about more cores to multi-thread.

New Block Model Control Point Declustering – Coming Soon To RockWorks16

(035) New Feature (11/16/14/JPR): The Decluster option within the block-modeling dialog has been completely redesigned.  Specifically, the program now uses a fast “octree” algorithm to remove duplicate points and to decluster clustered points.  This method essentially creates a temporary block model in which a parallelepiped enclosing the control points is recursively subdivided into octants until the octants are smaller than the specified maximum voxel size.  The program then looks for multiple occurrences of control points within each of these voxels and replaces them with a single point based on the user-specified declustering method.
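
As a rough illustration of the idea (not RockWorks’ actual code), the recursive octant subdivision and per-voxel consolidation can be sketched in a few lines of Python; each point is an (x, y, z, G) tuple, and the leaf step below simply averages the G-values as a stand-in for the user-selectable methods:

```python
# Rough sketch (not RockWorks' actual code) of octree-based declustering.
# Each point is an (x, y, z, g) tuple; the leaf step replaces the points
# in a voxel with their plain average.

def decluster(points, lo, hi, max_size):
    """Recursively subdivide the box [lo, hi] into octants until every
    edge is <= max_size, then collapse each leaf's points to one point."""
    if not points:
        return []
    if all(h - l <= s for l, h, s in zip(lo, hi, max_size)):
        n = len(points)  # leaf voxel: consolidate duplicates/clusters
        return [tuple(sum(p[k] for p in points) / n for k in range(4))]
    mid = [(l + h) / 2.0 for l, h in zip(lo, hi)]
    octants = {}  # bucket points by which side of the midpoint they fall on
    for p in points:
        octants.setdefault(tuple(p[k] >= mid[k] for k in range(3)), []).append(p)
    out = []
    for key, sub in octants.items():
        o_lo = [mid[k] if key[k] else lo[k] for k in range(3)]
        o_hi = [hi[k] if key[k] else mid[k] for k in range(3)]
        out.extend(decluster(sub, o_lo, o_hi, max_size))
    return out
```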


The declustering methods are described as follows:

  • Average: The average G-value for all points within a voxel.
  • Closest Point: The G-value for the point that is closest to the voxel midpoint.  Recommended for modeling color and lithology.
  • Distance Weighted: The estimated G-value based on an inverse-distance-squared weighting algorithm.  Recommended for modeling most data sets except for color and lithology.
  • Highest: The highest G-value for all points that reside within a voxel.
  • Lowest: The lowest G-value for all points that reside within a voxel.
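
A hypothetical sketch of these five consolidation rules, applied to the (x, y, z, G) points that fall inside one declustering voxel (function names are ours, not RockWorks’):

```python
# Hypothetical sketch of the five consolidation rules; pts is the list of
# (x, y, z, g) points inside one declustering voxel, mid is the voxel
# midpoint. Names are ours, not RockWorks'.

def _dist2(p, mid):
    return sum((p[k] - mid[k]) ** 2 for k in range(3))

def average(pts, mid):
    return sum(p[3] for p in pts) / len(pts)

def closest_point(pts, mid):
    return min(pts, key=lambda p: _dist2(p, mid))[3]

def distance_weighted(pts, mid):
    num = den = 0.0
    for p in pts:
        d2 = _dist2(p, mid)
        if d2 == 0.0:
            return p[3]  # a point exactly at the midpoint wins outright
        num += p[3] / d2
        den += 1.0 / d2
    return num / den

def highest(pts, mid):
    return max(p[3] for p in pts)

def lowest(pts, mid):
    return min(p[3] for p in pts)
```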

The Horizontal Resolution defines the declustering voxel x-size and y-size as a function of the specified project dimensions.  For example, if the Horizontal Resolution is set to 50% (the default) and the x-spacing for the project model is 100′, the horizontal size of a declustering voxel will be 50′.

The Vertical Resolution defines the declustering voxel height as a function of the specified project dimensions.  For example, if the Vertical Resolution is set to 50% (the default) and the z-spacing for the project model is 2 meters, the vertical size of a declustering voxel will be 1 meter.
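
In other words, each declustering voxel dimension is simply the corresponding project node spacing scaled by the resolution percentage (a trivial illustration; the helper name is ours, not a RockWorks API):

```python
# The declustering voxel size is just the project node spacing scaled by
# the resolution percentage (hypothetical helper, not a RockWorks API).

def decluster_voxel_size(node_spacing, resolution_pct):
    return node_spacing * resolution_pct / 100.0

print(decluster_voxel_size(100, 50))  # 100-ft x-spacing at 50% -> 50.0 ft
print(decluster_voxel_size(2, 50))    # 2-m z-spacing at 50% -> 1.0 m
```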

The Show Report option will display a dialog box (shown below) that summarizes how many points were consolidated via the declustering process.


In addition, this dialog provides an option to copy the declustered points to the RockWorks Utilities Datasheet (see below).


This data may be plotted in 3D by using the Utilities / Map / 3D-Points program (see below) to examine the effects of various declustering methods and resolution settings.


Additional Notes:

  • As shown by the examples above, declustering can reduce processing time by more than 50% (less than half the time!).
  • Creating declustering voxels that are larger than 50% of the project model voxel dimensions is not effective.  Beyond that point we encounter diminishing returns: the declustering spends more and more time consolidating the points that reside within each declustering voxel, without a corresponding reduction in modeling time.  That’s why we don’t recommend declustering resolutions greater than 50% – you lose accuracy with no speed benefit.
  • Declustering is turned on by default and set to the Closest Point method with a horizontal resolution of 50% and a vertical resolution of 50%.
  • Although the dimensions of the temporary voxels are based on the project model’s node spacing, the octree model extents may extend beyond the project dimensions in order to accommodate control points that reside outside the project dimensions.
  • In addition to handling clustered points, the declustering will eliminate any duplicate points that are passed to the modeling algorithm.  This is important because some of the modeling algorithms handle duplicate points poorly (e.g., producing divide-by-zero error messages).
  • Data sets that are uniformly distributed do not gain a speed benefit from declustering.  In fact, the declustering actually slows down the processing.  For example, a data set with 50,000 randomly distributed control points (see below) required 330 seconds without declustering and 340 seconds (3% slower) with declustering.


Fixing Unreadable RockWorks Menus

If you ever encounter unreadable RockWorks menus with dark filled rectangles hiding the underlying text (see example below) …

… you’re probably running an older version of Kaspersky Internet Security that apparently garbles the Windows theme/style settings.

Fortunately, the solution is very simple: Update your Kaspersky software to the latest version and the problem will go away.

Special thanks to Rudy Abo at TU Freiberg and Rafael Maricca at LMD Innovative.

A Strategy for Modeling Lithology within Faulted & Subsiding Basins Using RockWorks16

This case study involves a faulted and subsiding basin in which the faults do not extend above younger sediments. To model this geology, five data sets were used:

  1. downhole lithology logs (Figure 1),
  2. a surface topography model (not shown),
  3. two fault “ribbons” (not shown),
  4. a reference surface based on a gravity survey – for “warping” the interpolations into the basin (Figure 4), and
  5. an unconformity surface (Figure 7) that defines the contact between the younger, unfaulted sediments and the older, faulted units.

The younger and older sediments were modeled independently, using the unconformity surface as the common boundary, and then combined into a final model (Figure 12). The sediments above the unconformity were modeled without faulting or warping (Figure 8). The sediments below the unconformity were modeled with faulting and warping (Figure 10). Finally, the upper and lower models were combined (Figure 12) to create a model in which the upper units are relatively flat-lying and unfaulted while the lower units effectively subside into the basin.
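
The final combination step amounts to a voxel-wise merge keyed on the unconformity elevation; a minimal sketch, assuming a simple list-of-columns model layout (not RockWorks’ internal representation):

```python
# Toy sketch of the combination step: voxels above the unconformity take
# their lithology code from the upper (unfaulted) model, voxels below take
# it from the lower (faulted/warped) model. Names and layout are
# hypothetical, not RockWorks' internals.

def combine_models(upper, lower, unconformity, z_levels):
    """upper/lower: lithology codes indexed [column][level];
    unconformity: unconformity elevation for each vertical column;
    z_levels: voxel-center elevation of each level."""
    return [
        [upper[c][k] if z > top else lower[c][k] for k, z in enumerate(z_levels)]
        for c, top in enumerate(unconformity)
    ]
```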

Index to Diagrams

  1. Boreholes from which models were generated.
  2. Lithologic model without faulting or warping.
  3. Lithologic model with faulting but without warping.
  4. Reference surface used for model warping.
  5. Lithologic model using warping but without faulting.
  6. Lithologic model using warping and faulting.
  7. Unconformity surface representing the contact between the older, faulted geology and the younger, unfaulted geology.
  8. Lithologic model above unconformity. Neither warping nor faulting were used.
  9. Lithologic model below unconformity using warping but not faulting.
  10. Lithologic model below unconformity using both warping and faulting.
  11. Combined lithologic models below (with warping but without faulting) and above (no warping or faulting) unconformity.
  12. Combined lithologic models below (with warping and faulting) and above (no warping or faulting) unconformity.

All modeling was performed with the lateral extrusion algorithm.