
Reverse Engineering of 3D Point Cloud into NURBS Geometry

A thesis submitted to the

Graduate School of the University of Cincinnati

in partial fulfillment of the requirements for the Degree of

Master of Science

in the Department of Mechanical and Materials Engineering

of the College of Engineering & Applied Sciences (CEAS)

By

Shriyanka Joshi

Bachelor of Technology in Mechanical Engineering

Jawaharlal Nehru Technological University, Hyderabad, India - 2013

Committee Chair: Dr. Sam Anand

ABSTRACT

In the manufacturing domain, product design is the blueprint of the part. The product

design is stored and represented as a solid geometric model. Sometimes, we need to bring an

already manufactured part into the digital domain and re-create the blueprint for engineering

design purposes. Objects are typically scanned using contact or non-contact type scanners to obtain

a point cloud. The point cloud carries information about the part’s surface as coordinate data in

3D space.

This thesis outlines a novel method to reverse engineer a point cloud directly into CAD

geometry, without the necessity of converting the point cloud into a 3-D mesh format such as STL.

This approach is inspired by the process of layer-by-layer material deposition in additive

manufacturing. The point cloud is processed to obtain slices of points with a uniform slice

thickness. In the next step, closed B-spline curves are interpolated using the points of each slice

resulting in layer-wise curve profiles. These curves are either extruded or lofted in the CAD

environment from one slice to the next to obtain a solid CAD model. Concepts of Computational

Geometry and Image Processing are used in this approach.

Four case studies are performed to demonstrate the methodology. The results show

consistent success in capturing the near net shape of the objects. An improvement in the accuracy

of the final geometry can be observed upon reducing the slice thickness. However, the minimum

slice thickness is limited by the density of the point cloud.

ACKNOWLEDGMENT

I am grateful to my advisor Dr. Sam Anand for his guidance throughout my graduate school

journey. The academic, career, and moral support I received from Dr. Anand from time to time,

helped me grow and smoothly transition into a professional.

I am also thankful to Dr. Manish Kumar and Dr. Jing Shi for graciously accepting my

request to be a part of my thesis committee.

I am thankful to my lab mates, Omkar Ghalsassi, Botao Zhang, Vysakh Venugopal,

Matthew McConaha, Lun Li, Sourabh Deshpande, Nate, Srikanth Reddy Pydala for brainstorming

sessions on various topics, keeping up geeky banter and a generally fun atmosphere in the lab.

I dedicate this thesis to my family, my mother Manjusha Joshi, who always has my back;

my brother Shriyash Joshi, for keeping up with my shenanigans and especially my late father

Dileep Joshi, who would have been proud of me.

I thank my dear friends, Bala, Abhishek, Shivangi, Arun, Likith, Sarang, Udeet, Viraj, Bharat, and

Parth, who have stuck by me through thick and thin.

TABLE OF CONTENTS

ABSTRACT...................................................................................................................................................... 2
ACKNOWLEDGMENT ..................................................................................................................................... 4
TABLE OF CONTENTS..................................................................................................................................... 5
LIST OF FIGURES ............................................................................................................................................ 7
LIST OF TABLES .............................................................................................................................................. 9
1 INTRODUCTION ........................................................................................................................................ 10
1.1 BACKGROUND ....................................................................................................................................... 10
1.2 MOTIVATION OF RESEARCH.................................................................................................................. 11
1.3 OBJECTIVE AND IMPACT OF RESEARCH ................................................................................................ 12
1.4 THESIS OUTLINE .................................................................................................................................... 12
2 LITERATURE REVIEW ................................................................................................................................ 13
2.1 POINT CLOUD PROCESSING AND MESHING ......................................................................................... 14
2.2 SURFACE RECONSTRUCTION................................................................................................................. 14
2.3 CURVE RECONSTRUCTION FROM UNORGANIZED POINTS ................................................................... 16
3 METHODOLOGY ....................................................................................................................................... 17
3.1 3-D SCAN DATA POINTS ........................................................................................................................ 18
3.2 SLICING AND PROJECTION .................................................................................................................... 19
3.2.1 ALPHA SHAPES ................................................................................................................................... 21
3.3 IMAGE PROCESSING .............................................................................................................................. 23
3.4 IMAGE AND SCAN DATA SUPER-IMPOSITION....................................................................................... 26
3.5 CURVE DATA EXTRACTION .................................................................................................................... 30
3.5.1 STENCIL GENERATION ........................................................................................................................ 31
3.5.2 ALPHA-SHAPE AND SPLINE INTERPOLATION ..................................................................................... 32
3.6 GENERATION OF SOLID CAD BODY ....................................................................................................... 34
3.6.1 EXTRUSION ......................................................................................................................................... 34
4 CASE STUDIES AND RESULTS ............................................................................................................... 36
4.1 STANFORD BUNNY .......................................................................................................................... 36
4.1.1 LOFTING ...................................................................................................................................... 37
4.2 HUMAN HAND ...................................................................................................................................... 38
4.3 TURBINE BLADE ..................................................................................................................................... 39

4.4 TABLE .................................................................................................................................................... 40
5 VALIDATION OF CAD GEOMETRY ............................................................................................................. 44
5.1 STANFORD BUNNY – VALIDATION ........................................................................................................ 44
5.2 HUMAN HAND - VALIDATION ............................................................................................................... 46
5.3 TABLE - VALIDATION ............................................................................................................................. 47
5.4 RESULTS – DISCUSSION ......................................................................................................................... 48
5.4.1 OPTIMUM SLICE THICKNESS FOR ACCURACY OF FINAL GEOMETRY ................................................. 48
5.4.2 IMPACT OF NUMBER OF LAYERS ON ACCURACY ............................................................................... 49
5.4.3 ADAPTIVE SLICING SCHEME ............................................................................................................... 49
5.4.4 IMPACT OF BUILD DIRECTION ON ACCURACY OF FINAL GEOMETRY ................................................ 49
6 CONCLUSIONS AND FUTURE SCOPE ........................................................................................................ 53
7 REFERENCES ............................................................................................................................................. 55

LIST OF FIGURES

Fig 1 General workflow of Reverse engineering of a product

Fig 2 Point Cloud → STL mesh → B-Spline surface patches

Fig 3 Flowchart of the steps in the methodology of this research

Fig 4 Slicing direction, the orientation of slicing planes and slice thickness of the bunny

Fig 5 Slice points and the projection of slice points on to the bottom slicing plane

Fig 6 Sliced and projected scan data points of the bunny

Fig 7 Alpha-shapes by the union of disks R = 1/α

Fig 8 Alpha shapes without point cluster separation

Fig 9 Image of Front View of the bunny

Fig 10 Grayscale image

Fig 11 Binary image

Fig 12 Flow-chart of the silhouette processing algorithm

Fig 13 Enlarged view of a single slice of the image of the bunny along with silhouette points

Fig 14 Silhouette data and scan data plotted together

Fig 15 Co-ordinate systems

Fig 16 Rotation of silhouette

Fig 17 Translated silhouette and scan data to the origin

Fig 18 Superimposed scan and silhouette points

Fig 19 Enlarged view of a single slice of the super-imposed scan and silhouette data

Fig 20 Stencil generation

Fig 21 Alpha shapes and B-spline interpolation

Fig 22 Stencil and sliced scan data

Fig 23 Extruded Cylinder

Fig 24 CAD Bunny

Fig 25 The Stanford bunny case study with a smaller slice thickness

Fig 26 Lofted surface

Fig 27 Lofted Bunny

Fig 28 Lofted Human hand

Fig 29 Extruded Human hand

Fig 30 Turbine blade

Fig 31 4-Legged table

Fig 32 Resultant curves obtained from the front and side-view stencils of the table

Fig 33 Enlarged view of the stencil of table legs in the front view

Fig 34 Enlarged view of the stencil of table legs in the side view

Fig 35 Extruded CAD geometry of the table

Fig 36 Bounding box - bunny

Fig 37 Point registration for geometries obtained using different slice thicknesses for bunny

Fig 38 Bounding box – hand

Fig 39 Point registration for extruded hand geometry

Fig 40 Bounding box – table

Fig 41 Point registration for extruded table geometry

Fig 42 Slicing schemes and point cloud registration of the bunny

Fig 43 Slicing schemes and point registration of the hand

LIST OF TABLES

Table 1 Stanford Bunny case 1

Table 2 Stanford Bunny case 2

Table 3 Human hand

Table 4 Turbine Blade

Table 5 Four-Legged table

Table 6 Root mean squared distance error calculations for the bunny

Table 7 RMSE of the human hand

Table 8 RMSE of the table

Table 9 RMSE for different build orientations of the bunny

Table 10 RMSE for different build orientations of the hand

1 INTRODUCTION

1.1 BACKGROUND

Before a product is physically manufactured, its blueprint is designed digitally in a solid modeling system, and CAD software is used to represent a 3-D virtual model of the part to be manufactured. The processes of transferring product design data to manufacturing are well

established, and digital design data in neutral or native CAD file formats allow seamless transfer

of information during the entire Product Design Lifecycle.

However, it is often necessary to recover the original digital design from a physically manufactured part, a process known as reverse engineering. A digital model

may be needed to recover a true 3-D solid model of the part, modify the original part design for

new functionality, determine a new manufacturing method, or repair a broken part.

To recover digital data from a physical part, contact and non-contact scanners may be used.

The scan data is represented as point clouds in 3D space. Point clouds do not contain surface

information or design intent. They must be converted into a usable format for CAD, CAM, and

CAE purposes. Fig 1 shows a general workflow of the reverse engineering of a product.

This research work proposes an innovative approach for reverse engineering of scan data

into a CAD model based on additive manufacturing steps and principles.

Fig 1 General workflow of Reverse engineering of a product

1.2 MOTIVATION OF RESEARCH

Most available commercial software converts a point cloud into a 3-D mesh/STL [24]. A mesh representation is useful in Additive Manufacturing because it eases slicing operations. However, tedious manual processing is required to remove noise from a mesh, repair its defects, and make it watertight for 3D printing. Modifying a mesh is not easy

because the STL format by its nature was not designed to be edited. Editing a design is fairly

simple in native CAD platforms, and the edited design may then be exported in relevant file

formats to the AM/CAM machines or CAE purposes. However, unless the part design can be

represented using only a combination of primitive surfaces and solids, tools are not available to

easily reverse engineer an organic-looking part mesh into a CAD model. The motivation for this

research was to explore alternate methods to simplify the difficult process of reverse engineering

a 3-D point cloud to a NURBS CAD model. This methodology avoids converting the point cloud into a 3-D mesh and thereby simplifies the reverse engineering process. This approach applies to

parts with both regular and organic geometries.

1.3 OBJECTIVE AND IMPACT OF RESEARCH

This thesis outlines a novel approach for reverse engineering of point cloud data directly

into usable CAD geometry. A combination of algorithms and methods drawn from Computational

Geometry, Image Processing, and Additive Manufacturing principles are employed in coming up

with the conversion methodology that is subsequently applied to four case studies to demonstrate

the proof of concept.

This research points towards the issues with the recreation of a part digitally using existing

methodologies, and the prior reliance on obtaining a mesh-based representation of a part. This

thesis addresses the reverse engineering problem using a novel approach for direct conversion of

point cloud data to NURBS-based CAD geometry. Although there is much scope for improvement, this approach opens new possibilities for automation and simplification of the

process of reverse engineering.

1.4 THESIS OUTLINE

The introduction chapter of this thesis briefly describes the reverse engineering processes,

the motivation behind this research, its objectives, and its impact. A literature review of the past

research efforts in the field of reverse engineering is presented in the next chapter. The third

chapter provides a detailed explanation of the methodologies and algorithms. Test cases that

validate the proposed methodologies are demonstrated in the fourth chapter. Validation of the test

cases is presented in the fifth chapter. In the sixth chapter, conclusions, and future scope of this

research are discussed. The references are provided at the end of the thesis.

2 LITERATURE REVIEW

This chapter provides a literature review of various methods of reverse engineering,

approaches to point cloud processing and meshing, surface reconstruction and curve

reconstruction of unorganized point clouds.

Puntambekar et al. [25] provide a unified review of reverse engineering techniques, with

specific emphasis on the generation of a 3-D digital model from point data obtained from

various data collection methods. Existing commercial software methods of reverse engineering

a 3D scan point cloud into a CAD file format involve converting the point cloud into a

triangular mesh (STL), then using the mesh to identify regular features such as lines, arcs,

planes, and axes of symmetry [24]. These features are used in the CAD environment to generate a sketch that is subsequently converted into a solid. This approach works well for symmetric objects. An

approach to reverse engineer an organically shaped object as shown in Fig 2, involves the

processing of STL mesh and segmenting the mesh into regions. A surface patch is stitched

over such a region. The individual patches are then stitched together to form a solid body.

These steps are difficult because considerable processing goes into obtaining a usable mesh from a point cloud; the point cloud data itself must first be cleaned to remove the noise inherent in the scanning process.

Fig 2 a) Point Cloud [10] b) STL mesh (rendered in Meshlab) c) B-Spline surface [7]

Previous research on the topic of reverse engineering of point clouds is summarized in the next

few sections.

2.1 POINT CLOUD PROCESSING AND MESHING

Varady et al. [1] provide a review of reverse engineering methods, algorithmic steps, and

different strategies of reconstruction of geometric models. The issues with data collection methods

and output surface representations are discussed. Lee et al. [2] describe a method of generation of

STL files from point data by segmentation and Delaunay triangulation. Schemenauer et al. [3]

reverse-engineered a point cloud using a point cloud registration method to integrate the point data

of an object from various angles. They then used a voxel-based method for meshing the point cloud

and obtaining iso-surface curves that were exported as an STL file for slicing and Rapid

prototyping.

2.2 SURFACE RECONSTRUCTION

Lee and Woo [4] develop a rapid prototyping model directly from point clouds. Their

approach involves slicing of point data and fitting B-spline curve cross-sections. The focus in this

research is on the reduction of cross-sectional data by obtaining intermediate cross-sections by

Homotopy (interpolation) and representing only those cross-sections where the body changes its

curvature significantly. However, the case studies are performed only on simple geometries. Also, the final output, being a layer-wise RP model, can be used only for fabrication and is not suitable for making engineering design changes to the part if needed. Wu et al. [5] developed a rapid

prototyping model by directly slicing a point cloud with adaptive layer thickness. The focus of this

research was the use of adaptive layering. The polygonal curve profiles on every layer were

generated using a nearest neighbor algorithm. The nearest neighbor method has its limitations and

can only apply to cross-sections without too much variation in curvature. Case studies were

performed on simple geometries. Lee [6] presented a method for curve reconstruction from

unorganized point clouds using a moving-least squares approach and point cloud thinning. The

resultant points are approximated with a quadratic B-Spline curve. This approach works for limited

geometries such as pipes. Yuwen et al. [26] propose a B-spline surface reconstruction method by

direct slicing of point clouds, obtaining sectional contours using a correlated point pair region

approach, and fitting a surface over the contours. Eck and Hoppe [7] develop an automatic

reconstruction of B-Spline surfaces using point clouds of arbitrary topological type. This research

provides the best available solution of bringing point clouds into the CAD domain. Their approach

involves the construction of tensor product B-Spline patches, connected with tangential (G1)

continuity. The process involves parametrization over a triangular domain and re-parametrization

over a quadrilateral domain before fitting a B-Spline surface by minimizing a least-squared

distance function. The solution is mathematically robust; however, it relies on the processing of a

mesh which is obtained by the Delaunay triangulation of point clouds. A good mesh is required for good surface fitting, and in turn a good point cloud is required for a good mesh. Imperfect

scanning methods and practical limitations of scanners result in noisy, half-formed point clouds

that need to be registered together to visualize an object completely. Meshes obtained from point

clouds are rarely tear-free and crease-free. Ma and Kruth [27] developed a methodology for the

fitting of Non-Uniform Rational B-Spline (NURBS) surfaces on point clouds using a symmetric

eigenvalue decomposition method to obtain weights. Then, the weights are used as known

parameters to solve for control points. It was pointed out that the approach is a trade-off between

smoothness and accuracy of the final geometry.

2.3 CURVE RECONSTRUCTION FROM UNORGANIZED POINTS

Edelsbrunner et al. [8] first introduced the concept of “Alpha shapes,” an algorithm to

obtain the shape of a set of unorganized points in the form of a line graph. Yang et al. [9] proposed

a model to approximate point clouds with an implicit B-Spline curve using a geometric distance

minimization problem formulation.

3 METHODOLOGY

In Additive Manufacturing, a file in STL format is sliced thinly with planes in order to

obtain slice contours. The AM machine reads each slice contour and deposits material in a slice

by slice manner until the entire part is manufactured. The approach in this thesis is inspired by

the steps and the process of additive manufacturing. To start with, slices are obtained by

processing point cloud data, which is followed by interpolating closed periodic B-spline curves

onto the contour points on each slice. Resultant slice curves are then extruded or lofted from

one slice to the next until a solid CAD body is generated. The steps of the proposed reverse engineering methodology are shown as a flowchart in Fig 3.

Fig 3 Flowchart of the steps in the methodology of this research (adapted bunny from [10])

The methodology is explained with the use of a case study. For the first case study, a standard

point cloud dataset – The Stanford bunny [10] is chosen.

3.1 3-D SCAN DATA POINTS

Initially, the scan data file is read and plotted for visualization, as shown in Fig 4. A slicing

direction is chosen, and the span of the point cloud in that direction is calculated. A desired number of slices is chosen, and the span is divided by the number of slices to obtain

the slice thickness as shown in Fig 4. The slice thickness can be changed based on the chordal

error calculated after conversion to NURBS geometry. Equation 1 presents the calculation of slice

thickness based on the number of slices and the bounds of the scan data.

T = (Ymax – Ymin) / N -----(1)

where Ymin = Minimum Y coordinate of the point cloud, Ymax = Maximum Y coordinate of the

point cloud, and N = the desired number of slices on the point cloud.
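As a minimal sketch, Equation (1) can be written as a short Python helper. The function name and the list-of-tuples point representation are illustrative, not from the thesis implementation:

```python
def slice_thickness(points, n_slices):
    """Uniform slice thickness along the slicing direction (Equation 1).

    points: list of (x, y, z) tuples; the slicing direction is assumed
    to be the Y axis, as in the bunny case study.
    """
    ys = [p[1] for p in points]          # Y coordinates of the scan data
    return (max(ys) - min(ys)) / n_slices
```

The thickness can then be recomputed for a different number of slices when the chordal error of the final NURBS geometry demands it.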

Fig 4. Slicing direction, the orientation of slicing planes and slice thickness of the Stanford

bunny (adapted from [10])

3.2 SLICING AND PROJECTION

Unlike STL slicing, where every triangle of the STL is intersected with a slicing plane, a

point cloud is only a set of point coordinates in 3D space and cannot directly intersect with a plane.

To slice a point cloud with a slice thickness ‘t’, parallel planes are created over the point cloud

span in the slicing direction. The points lying between each pair of adjacent planes are then projected onto the bottom plane of their respective slice, as shown in Figures 5 and 6. The

transformation equations and steps required to project the points lying within an individual slice

onto its slice plane are outlined below.

In the bunny’s case, the first plane lies on the XZ-plane or Y = 0 plane. Subsequent planes

are incremented in height from Y = 0 plane by a constant, user-defined slice thickness ‘t’. The

transformation equation (2) for slicing and projection is given by:

Projected slice points = Sliced scan points × T × Py × T⁻¹ -----(2)

where T = translation matrix; Py = projection matrix; T⁻¹ = inverse of the translation matrix

T = [ 1    0   0   0
      0    1   0   0
      0    0   1   0
      0  (−h)  0   1 ]

Here, T translates every slice plane to coincide with the XZ plane, and h = height of the slice from the Y = 0 plane.

Py = [ 1  0  0  0
       0  0  0  0
       0  0  1  0
       0  0  0  1 ]

where Py is the matrix for projecting the points onto the Y = 0 plane. The points lying inside every slice are projected onto the bottom slice plane using this matrix.
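The transform of Equation (2) can be sketched in plain Python using homogeneous row vectors. Names are illustrative; note that the net effect of T · Py · T⁻¹ is simply to replace each point's Y coordinate with the slice height h:

```python
def mat_mul(a, b):
    """Multiply two 4x4 matrices stored as row-major nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def project_to_slice_plane(point, h):
    """Apply  p' = p . T . Py . T_inv  (Equation 2) to one (x, y, z) point."""
    T     = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, -h, 0, 1]]
    Py    = [[1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 1, 0], [0,  0, 0, 1]]
    T_inv = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0,  h, 0, 1]]
    M = mat_mul(mat_mul(T, Py), T_inv)
    p = [point[0], point[1], point[2], 1]   # homogeneous row vector
    out = [sum(p[k] * M[k][j] for k in range(4)) for j in range(4)]
    return (out[0], out[1], out[2])
```

For example, a point at height 5.3 inside a slice whose bottom plane sits at h = 5.0 is mapped onto that plane with its X and Z coordinates unchanged.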

Fig 5 Slice points and the projection of slice points on to the bottom slicing plane

Fig 6 Sliced and projected scan data points of the bunny

We now have a set of unordered points on each plane. In the next step, closed curves need to be constructed from these points, forming an outer shape for each cluster. This information is simple to deduce for the human eye; however, an algorithm requires

spline curve, information about how the points are ordered must be known. After a literature

review, it was found that Alpha-shapes can generate a shape from a set of unordered points [8].

3.2.1 ALPHA SHAPES

Alpha-shapes present the notion of the “shape” or the concave-hull of a finite set of points in a

plane. Herbert Edelsbrunner [11] explains the concept of alpha shape intuitively and presents an

algorithm to compute them. The alpha-shape depends on the shape parameter α, which is a small

positive real number that controls the desired level of detail of the polygon. The α-hull of a set of points S is the intersection of all closed disks of radius 1/α that contain all the points of S [8, 12]. The “eraser” intuition is to take a disk of a given radius and roll it over the set of unorganized points, as shown in Figure 7. If the radius of the disk is greater than the distance between a pair of adjacent points, the disk rests on the “boundary”; if the radius is smaller than that distance, the disk “falls inside” the cloud through the gap and comes to rest on another pair of points whose spacing is smaller than the radius. Joining each such pair of points with a straight line segment results in a straight-line graph, or polygon, that defines the boundary of the unorganized point cloud.

The set of alpha-shapes for an unorganized point-set is not unique, but a spectrum. The

parameter α can be adjusted to best fit our requirements. Alpha-shape gives the order in which the

unordered points must be joined to form a boundary. The α – neighbors are points joined with

straight line segments, thus forming a polygon that defines the shape. When the parameter α

becomes smaller, the radius of the disk 1/α becomes larger. As R→∞, the alpha-shape becomes

the convex hull; as R→0, the alpha-shape becomes the point-set itself. Fig 7 shows the convex hull and an alpha-shape, for some positive value of α, of the same set of points.
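The “rolling disk” intuition can be illustrated with a brute-force sketch that, for each pair of points, tests whether some disk of radius R = 1/α through both points is empty of all other points. This is an O(n³) illustration only, not the efficient algorithm of [8, 11], and the names are ours:

```python
import math

def alpha_shape_edges(points, R):
    """Boundary edges of the alpha-shape of 2-D points for disk radius R.

    An edge (i, j) is kept if a disk of radius R passing through points i
    and j contains no other point of the set -- the 'eraser' test.
    """
    edges = set()
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = points[i], points[j]
            d2 = (x2 - x1) ** 2 + (y2 - y1) ** 2
            if d2 == 0 or d2 > 4 * R * R:
                continue                 # farther apart than the disk diameter
            # Midpoint plus an offset along the perpendicular bisector gives
            # the two possible centers of a radius-R disk through both points.
            mx, my = (x1 + x2) / 2, (y1 + y2) / 2
            h = math.sqrt(R * R - d2 / 4)
            ux, uy = (y1 - y2) / math.sqrt(d2), (x2 - x1) / math.sqrt(d2)
            for cx, cy in ((mx + h * ux, my + h * uy),
                           (mx - h * ux, my - h * uy)):
                if all((px - cx) ** 2 + (py - cy) ** 2 > R * R - 1e-9
                       for k, (px, py) in enumerate(points) if k not in (i, j)):
                    edges.add((i, j))    # an empty disk exists: boundary edge
                    break
    return edges
```

On the four corners of a unit square with R = 1, the four sides are kept while the diagonals are rejected, matching the intuition that a unit disk falls through the square's interior.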

Fig 7 Alpha-shapes by the union of disks R = 1/α

If a given slice has multiple clusters of points, the algorithm interprets all the clusters on

that slice as one and generates an alpha-shape for the entire set. As seen in Fig 8, calculating the alpha-shape for the points on every slice without separating the point clusters results in the incorrect joining of the bunny’s ears. The point-set must be separated into its apparent individual clusters

on its slice and an alpha-shape must be calculated for each cluster. Fig 8 (a) shows B-Splines fitted

onto the points of every slice and Fig 8(b) depicts the slice curves obtained using alpha-shape

versus the desired output on a single slice at the bunny’s ears. It is evident that the points need to

be clustered appropriately in order to separate the bunny's ears.

Fig 8 (a, b) Alpha shapes without point cluster separation

3.3 IMAGE PROCESSING

In the next step, the orthogonal view of the scan data cloud of the bunny is converted to

image data in order to extract the silhouette details of the bunny. The silhouette information will

be used to form clusters of points that separate the object features on each slice. To solve the

problem of point cluster separation, a colored image of the sliced point cloud of the entire part in

an orthogonal view is saved and analyzed to extract useful layer-wise cluster points. The

orthogonal view image of the bunny is shown in Fig 9.

Fig 9 Image of Front View of the bunny

A color image is represented in RGB format using 24 bits per pixel, with each of the Red, Green, and Blue channels stored as an 8-bit matrix of 256 possible values or shades. To convert an RGB image to a grayscale image, the weighted or luminosity method is used, in which the green component is given the highest weight. As shown in Fig 10, the colored image is converted into a grayscale image using equation (3), the luminosity method proposed in [23, 13]:

Grayscale image = ((0.3 * R) + (0.59 * G) + (0.11 * B)) -----(3)

Fig 10 Grayscale image

A grayscale image is represented by 8 bits per pixel, with gray values ranging from 0 to 255. To convert a grayscale image to a binary image, the pixel values must be thresholded. The gray threshold value chosen is the midpoint of 0 and 255 (~128). The grayscale image is then converted into a binary image (shown in Fig 11) using the simple binary thresholding method of equation (4):

Binary image = Grayscale image < 128 -----(4)
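Equations (3) and (4) can be sketched in a few lines of NumPy. The thesis implementation uses the MATLAB Image Processing Toolbox [13], so this Python version is only illustrative (the function names and the toy image are assumptions):

```python
import numpy as np

def to_grayscale(rgb):
    """Luminosity method, equation (3): weighted sum of the R, G, B channels."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.3 * r + 0.59 * g + 0.11 * b

def to_binary(gray, threshold=128):
    """Equation (4): pixels darker than the threshold become 1 (object)."""
    return (gray < threshold).astype(np.uint8)

# A 2x2 test image: white, black, dark gray, and light gray pixels.
img = np.array([[[255, 255, 255], [0, 0, 0]],
                [[100, 100, 100], [200, 200, 200]]], dtype=float)
gray = to_grayscale(img)   # white -> ~255, black -> 0
mask = to_binary(gray)     # dark pixels -> 1, light pixels -> 0
```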

A binary image holds either 0 (black) or 1 (white) at each pixel and is therefore represented by a single bit per pixel. The binary image derived from the grayscale image is shown in Fig 11.

Fig 11 Binary image

Next, the binary image is subjected to a simple raster scan to extract the silhouette information of the bunny. The algorithm to obtain the image silhouette is outlined below and is also presented in Fig 12. Starting from the top-left corner and scanning each row left to right, from the first pixel of the image to the last, the location of the white pixel is recorded whenever a black pixel is adjacent to a white pixel or a white pixel is adjacent to a black pixel. The silhouette of the object is thus obtained from the identified white pixels. In Fig 13, the calculated silhouette points are shown in red. A magnified image of a single slice of the bunny, along with the silhouette of that slice, is also shown in Fig 13.
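The raster-scan step can be sketched in Python as follows (illustrative only: the thesis records the white-pixel location at each black/white transition, while this sketch records the column index at which the value changes):

```python
import numpy as np

def silhouette_points(binary):
    """Scan each row left to right and record the pixel locations where
    the binary value changes, i.e. black/white boundary transitions."""
    points = []
    for row in range(binary.shape[0]):
        for col in range(1, binary.shape[1]):
            if binary[row, col] != binary[row, col - 1]:
                points.append((row, col))
    return points

# One image row with a run of object pixels (1s) inside background (0s).
img = np.array([[0, 0, 1, 1, 1, 0, 0]])
print(silhouette_points(img))   # -> [(0, 2), (0, 5)]
```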

Fig 12 Flow-chart of the silhouette processing algorithm

Fig 13 Enlarged view of a single slice of the image of the bunny along with silhouette points

3.4 IMAGE AND SCAN DATA SUPER-IMPOSITION

The silhouette points obtained by image processing need to be superimposed with the sliced

scan data points to extract point clusters belonging to separate features of the part. As the silhouette

data comes from an image, it belongs to a different coordinate system from the sliced scan data.

As seen in Fig 14, the silhouette data is transformed (rotated, translated, and scaled) with respect

to the scan data. The two data sets can be superimposed using geometric transformations.

Fig 14 Silhouette data and scan data plotted together

Fig 15 shows how data is stored in matrices for a Cartesian coordinate system and for a pixel coordinate system. Because the two systems are flipped relative to each other, the silhouette coordinates must be rotated by 180 degrees when plotted together with the scan data, which lies in the Cartesian coordinate system.

Fig 15 Co-ordinate systems

To superimpose both sets of data, necessary transformations must be performed on the

silhouette data. The transformation equations adapted from [14] are used to superimpose the

silhouette points with sliced scan data as outlined in the steps below:

1) Rotate the silhouette coordinates about the Z-axis to orient them with the sliced scan data. The rotation is performed about the Z-axis because the front view of the bunny lies on the XY-plane. The transformation is given by equation (5):

Rotated Silhouette = Silhouette x Rz -----(5)

Rz = [ cos(theta)   sin(theta)
      -sin(theta)   cos(theta) ]

where Rz = rotation matrix; theta = angle of rotation about the Z-axis (180 degrees).

The set of silhouette points obtained from the image processing step is given by

Silhouette = [(x1, y1), (x2, y2), (x3, y3), ….., (xn, yn)]

The rotated silhouette coordinates are shown in Fig 16.

Fig 16 Rotation of silhouette

2) Next, translate the origin of silhouette coordinates to the origin of 3D space (0,0,0) as given by

equation (6):

Translated Silhouette = Rotated Silhouette + Tsil -----(6)

where Tsil = [-Silx, -Sily] is the translation matrix, Silx is the minimum x coordinate of the silhouette data, and Sily is the minimum y coordinate of the silhouette data.

The translation of the origin of the sliced scan data points to the origin of 3D space (0,0,0) is given by equation (7):

Translated Scan = Sliced Scan + Tscan -----(7)

where Tscan = [-Scanx, -Scany, -Scanz] is the translation matrix, and Scanx, Scany, and Scanz are the minimum x, y, and z coordinates of the scan data, respectively.

The translated silhouette and scan data are shown in Fig 17.

Fig 17 Translated silhouette and scan data to the origin

3) The silhouette data and scan data now share the same origin (0,0,0). The two data sets must be scaled for exact superimposition. The scaling approach is adapted from [15]: the scaling matrix is obtained using equations (8) and (9) [15], and the transformation equation for scaling is given by equation (10) [15].

Scaling factor x = Scan x length / Silhouette x length -----(8)

Scaling factor y = Scan y length / Silhouette y length -----(9)

where, Silhouette x length = max of Silhouette x co-ordinate – min of Silhouette x co-ordinate

Silhouette y length = max of Silhouette y co-ordinate – min of Silhouette y co-ordinate

Scan x length = max of Scan x co-ordinate – min of Scan x co-ordinate

Scan y length = max of Scan y co-ordinate – min of Scan y co-ordinate

The scaling matrix S is given by:

S = [ Scaling factor x          0
            0            Scaling factor y ]

Superimposed Silhouette = Translated Silhouette x S -----(10)

The superimposed sliced scan data and silhouette data after performing geometric transformations

are shown in Fig 18.

Fig 18 Superimposed scan and silhouette points
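The three transformation steps can be condensed into a short NumPy sketch following the row-vector convention of equation (5). The thesis implementation is in MATLAB; the function name and toy data below are illustrative assumptions:

```python
import numpy as np

def superimpose(silhouette, scan_xy):
    """Rotate the silhouette 180 degrees about Z (equation 5), translate
    both data sets to the origin (equations 6 and 7), then scale the
    silhouette to the scan data's extents (equations 8 to 10)."""
    theta = np.pi
    Rz = np.array([[np.cos(theta),  np.sin(theta)],
                   [-np.sin(theta), np.cos(theta)]])
    sil = silhouette @ Rz                  # equation (5), row vectors
    sil = sil - sil.min(axis=0)            # equation (6)
    scan = scan_xy - scan_xy.min(axis=0)   # equation (7)
    scale = (scan.max(axis=0) - scan.min(axis=0)) / \
            (sil.max(axis=0) - sil.min(axis=0))     # equations (8), (9)
    return sil * scale, scan               # equation (10)

sil = np.array([[0.0, 0], [10, 0], [10, 10], [0, 10]])  # pixel coordinates
scan = np.array([[3.0, 7], [5, 7], [3, 11], [5, 11]])   # scan slice (x, y)
sil2, scan2 = superimpose(sil, scan)   # both now span the same box
```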

3.5 CURVE DATA EXTRACTION

In this section, the methodology to extract separate point cluster data from the

superimposed scan and silhouette data is explained.


3.5.1 STENCIL GENERATION

For a given layer, two consecutive sets of silhouette points in a single slice are used to form a rectangle. The sliced and projected 3-D scan data points lying within the bounds of each such 2-D rectangle represent a cluster. The set of these generated rectangles for the entire part acts as a stencil or sieve through which the scan points can be filtered. From the set of scan points from all the slices, the points lying inside each rectangle are isolated using the point-in-polygon test. Thus, there are as many rectangles as there are clusters in the sliced scan data. The rectangles generated using the silhouette data are shown in Fig 19. Each rectangle is generated by joining the silhouette coordinates in that slice as follows:

Rectangle Vertices = [(min x, min y), (max x, min y), (max x, max y), (min x, max y)]

Let a scan data point be (x, y, z). A scan point lying inside the rectangle satisfies:

Min x of Rectangle < x of Scan data point < Max x of Rectangle; and
Min y of Rectangle < y of Scan data point < Max y of Rectangle

Fig 19 Enlarged view of a single slice of the super-imposed scan and silhouette data

The scan data lies between the two sets of silhouette data used for creating the rectangular bounding box. However, in some cases, due to the inherent variability in the calculation of silhouette points from image pixels, the scan data may lie on the edge of, or sometimes outside, the generated rectangle. To overcome this problem, a small tolerance, proportionate to the size of the slice, is added to every rectangle to ensure that the scan data lies wholly inside it. Fig 19 shows a rectangle with this tolerance created over the silhouette points. The points lying inside each rectangle represent one island or cluster of points on the slice of the bunny. The stencil generated is shown in Fig 20. The point-in-rectangle test with the added tolerance is given by equations (11) and (12):

(Min x of Rectangle – tolerance) < x of Scan data point < (Max x of Rectangle + tolerance) -----(11)

(Min y of Rectangle – tolerance) < y of Scan data point < (Max y of Rectangle + tolerance) -----(12)

Fig 20 Stencil generation
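Equations (11) and (12) amount to a toleranced point-in-rectangle filter. A minimal Python sketch (the coordinates and tolerance value are illustrative):

```python
def points_in_rectangle(points, rect_min, rect_max, tol):
    """Keep the (x, y) points lying inside the rectangle expanded by a
    small tolerance on every side, per equations (11) and (12)."""
    (xmin, ymin), (xmax, ymax) = rect_min, rect_max
    return [p for p in points
            if xmin - tol < p[0] < xmax + tol
            and ymin - tol < p[1] < ymax + tol]

# A point just outside the rectangle's right edge is still captured
# because of the tolerance; the far-away point is rejected.
cluster = points_in_rectangle([(1.0, 1.0), (5.05, 2.0), (9.0, 9.0)],
                              (0.0, 0.0), (5.0, 3.0), tol=0.1)
print(cluster)   # -> [(1.0, 1.0), (5.05, 2.0)]
```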

3.5.2 ALPHA-SHAPE AND SPLINE INTERPOLATION

Thus, using the point-in-polygon test, every cluster of scan points is separated. In the next step, the alpha-shape is calculated for every cluster [16] to obtain its outer shape. The alpha-shape gives the order in which to join the points on the boundary of the point cluster. Using this order, the points are joined with a periodic interpolating cubic spline curve [17, 18]. Fig 21(a) shows the alpha-shape for a single slice of the bunny, and Fig 21(b) shows a 3rd-degree B-spline fitted on the obtained alpha-shape points to form the outer contour. Fig 22(a) shows the scan data points isolated within each rectangle or island, and Fig 22(b) depicts the closed B-splines fitted onto the scan data points of each island.
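The periodic cubic interpolation can be sketched with SciPy (the thesis uses the MATLAB Curve Fitting Toolbox [18]; scipy.interpolate is a stand-in here, and the circle points stand in for ordered alpha-shape boundary points):

```python
import numpy as np
from scipy.interpolate import splev, splprep

# Ordered boundary points, with the first point repeated at the end so
# the contour closes (the ordering comes from the alpha-shape step).
t = np.linspace(0.0, 2.0 * np.pi, 21)
x, y = np.cos(t), np.sin(t)

# per=1 requests a periodic (closed) spline, k=3 the cubic degree,
# s=0 interpolation through the points.
tck, u = splprep([x, y], k=3, per=1, s=0)

# Evaluate the closed contour densely along the parameter.
xs, ys = splev(np.linspace(0.0, 1.0, 200), tck)
```

The evaluated contour starts and ends at the same point, as required for a closed slice curve.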


Fig 21 Alpha shapes and B-spline interpolation

Fig 22 Stencil and sliced scan data

3.6 GENERATION OF SOLID CAD BODY

We now have a set of layer-wise stacked contours. These contours are imported into CAD software (using NX Journaling). In the next step, these contours must be extruded from one layer to the next to form a solid body. The solid sweep of a curve along a straight-line path is the CAD operation called "Extrude."

3.6.1 EXTRUSION:

The mathematical representation of the CAD operation Extrude (or Pad, or Sweep in various CAD environments) is called "the general cylinder" in [19]:

"Let W be a vector of unit length and C(u) = Σ_{i=0}^{n} R_{i,p}(u) P_i be a pth-degree NURBS curve on the knot vector U, with weights wi. The extrusion S(u, v) is obtained by sweeping (translating) C(u) a distance d along W. The sweep direction is denoted by the parameter v, where 0 ≤ v ≤ 1." [19]

The extrusion surface is represented by equation (13) [19]:

S(u, v) = Σ_{i=0}^{n} Σ_{j=0}^{1} R_{i,p;j,1}(u, v) P_{i,j} -----(13)
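For an extrusion, the control net P_{i,j} of equation (13) is simply two copies of the curve's control polygon, the second translated a distance d along W, with the curve's weights carried over unchanged [19]. A minimal NumPy sketch (the square control polygon is an illustrative stand-in for a slice curve):

```python
import numpy as np

def extrude_control_net(P, w, W, d):
    """Build the (n+1) x 2 control net of equation (13): row j=0 is the
    curve's own control polygon, row j=1 is translated by d along W;
    the curve's weights are duplicated unchanged."""
    W = np.asarray(W, dtype=float) / np.linalg.norm(W)
    Pij = np.stack([P, P + d * W], axis=1)   # shape (n+1, 2, 3)
    wij = np.stack([w, w], axis=1)           # shape (n+1, 2)
    return Pij, wij

# Extrude a unit-square control polygon 5 units along +Z.
P = np.array([[0.0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]])
w = np.ones(4)
Pij, wij = extrude_control_net(P, w, W=[0, 0, 1], d=5.0)
```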

Fig 23 illustrates the generation of a cylinder by extruding a circular curve along a straight-line path using Siemens NX.

Fig 23 Extruded Cylinder (rendered in NX software)

The conversion of curves to a solid body is performed by importing the B-Spline curves

into a CAD environment and extruding each slice by a distance equal to the original slice thickness

‘t’ taken during the slicing and projection step. A solid CAD body of the bunny is thus generated

and is shown in Fig 24. Table 1 depicts the number of points, slice thickness, and number of slices

of the bunny.


Fig 24 CAD Bunny

Table 1 Stanford Bunny case 1

Case study Number of points Slice thickness (units) Number of slices

Stanford bunny 35,947 0.003 52

4 CASE STUDIES AND RESULTS

In this chapter, the proposed methodology was tested using four case studies, and the accuracy

of the conversion was validated by calculating the root mean square error of the CAD model when

compared to the scan points.

4.1 STANFORD BUNNY

Case study 1, the Stanford bunny, was repeated with several finer slice thicknesses. The result

obtained for a smaller slice thickness is shown in Fig 25. Table 2 depicts the number of points,

slice thickness, and the number of slices of the bunny.

Fig 25 The Stanford bunny case study with a smaller slice thickness

Table 2 Stanford Bunny case 2

Case study Number of points Slice thickness (units) Number of slices

Stanford bunny 35,947 0.002 78

Upon bringing the curves into the CAD environment, the curves may be either extruded or

lofted to produce NURBS geometry. The method to be used depends on the overall shape of the

object and the lofting suitability of the cross-sections generated. The layer-wise curves may need

to be re-parametrized in order to have the same degree and same knot vector to obtain a smooth

loft surface.

4.1.1 LOFTING

The mathematical representation of the "lofting" CAD operation is referred to as "the ruled surface" in [19]:

"Let C1(u) and C2(u) be two NURBS curves defined on the knot vectors U1 and U2, respectively. The ruled surface in the v direction is a linear interpolation between C1(u) and C2(u). The interpolation is required to be between points of equal parameter value." [19]

The desired surface form is given by equation (14) [19]:

S(u, v) = Σ_{i=0}^{n} Σ_{j=0}^{1} R_{i,p;j,1}(u, v) P_{i,j} -----(14)

Fig 26 illustrates the generation of a lofted surface between two randomly shaped planar curves in

Siemens NX.

Fig 26 Lofted surface (rendered in NX software)
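In effect, equation (14) linearly interpolates, at each parameter value u, between corresponding points of the two section curves. The geometric idea can be sketched as follows (the two circles are illustrative; a true loft first makes the NURBS curves compatible in degree and knot vector, as noted above):

```python
import numpy as np

def ruled_point(C1, C2, u, v):
    """A point of the ruled surface: linear interpolation in v between
    the two section curves evaluated at the same parameter value u."""
    return (1.0 - v) * C1(u) + v * C2(u)

# Two parallel circles of different radii as the section curves.
C1 = lambda u: np.array([np.cos(2 * np.pi * u), np.sin(2 * np.pi * u), 0.0])
C2 = lambda u: np.array([2 * np.cos(2 * np.pi * u), 2 * np.sin(2 * np.pi * u), 1.0])
mid = ruled_point(C1, C2, u=0.0, v=0.5)   # halfway between the two curves
```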

For the Stanford bunny, using the lofting operation resulted in folds and creases over the bunny’s

surface, as shown in Fig 27.

Fig 27 Lofted Bunny

4.2 HUMAN HAND

Fig 28 and Fig 29 show the CAD surfaces obtained from the point cloud of a human hand [28]. After generating the layer-wise curves, the final geometry was produced by both lofting and extrusion. Table 3 presents the number of points, slice thickness, and the number of slices of the hand.

Fig 28 Lofted Human hand a) point cloud (adapted from [28]) b) curves c) loft geometry

Fig 29 Extruded Human hand a) point cloud (adapted from [28]) b) curves c) extruded geometry

Table 3 Human hand

Case study Number of points Slice thickness (units) Number of slices

Human hand 40,000 5 36

4.3 TURBINE BLADE

Fig 30 shows the scan point cloud data of a turbine blade [29], the sliced curves, and the result of using both extrusion and lofting to generate a solid CAD body. Table 4 depicts the number of points, slice thickness, and the number of slices of the turbine blade.

Fig 30 Turbine blade a) point cloud (adapted from [29]) b) curves c) CAD geometry

Table 4 Turbine blade

Case study Number of points Slice thickness (units) Number of slices

Turbine blade 882,954 30 20

4.4 TABLE

Fig 31 shows the scan point cloud data of a four-legged table [28], the sliced scan point data

along with generated stencils in both front and side views.

Fig 31 4-Legged table a) point cloud (adapted from [28]) b) front and side view stencils

For objects such as this table, image processing of the front view alone does not provide complete information about the object. The four table legs need to be separated to obtain four sets of curves on every slice. The B-spline curves resulting from the layer-wise alpha-shapes upon analyzing the front view and side view stencils separately are shown in Fig 32.

Fig 32 Resultant curves obtained from the front and side-view stencils of the table

The intersection of the point clusters lying inside a single rectangle of a slice Si obtained from the front view with those of the corresponding slice obtained from the side view results in the separation of all four legs. An enlarged view of the rectangles in the front and side views of the table is shown in Fig 33 and Fig 34.

Fig 33 Enlarged view of the stencil of table legs in the front view

Fig 34 Enlarged view of the stencil of table legs in the side view

The algorithm to separate point clusters with the data obtained from front and side view stencils is

outlined below:

Let, the set of points lying inside one rectangle R1 of a single slice Si in the front view be

represented as FVR1, and the set of points lying inside one rectangle R1 of the same slice Si in the

side view be SVR1. Two clusters are separated from the intersection of point sets FVR1 and SVR1,

given by equations (15) and (16)

C1 = FVR1 ∩ SVR1 -----(15)

C2 = FVR1 – C1 -----(16)

Let, the set of points lying inside the second rectangle R2 of the slice Si in the front view be

represented as FVR2 and the set of points lying inside the second rectangle R2 of the slice Si in the

side view be SVR2. Two clusters are separated from the intersection of point sets FVR2 and SVR2,

given by equations (17) and (18)

C3 = FVR2 ∩ SVR2 -----(17)

C4 = FVR2 – C3 -----(18)

where, C1, C2, C3, and C4 are the separated point clusters.
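Equations (15) and (16) (and likewise (17) and (18)) can be sketched with Python set operations; the integer point IDs below are hypothetical stand-ins for actual scan points:

```python
def separate_clusters(fv_points, sv_points):
    """Equations (15) and (16): intersect the front-view rectangle's
    points with the side-view rectangle's points, then take the set
    difference to split off the remaining cluster."""
    FVR, SVR = set(fv_points), set(sv_points)
    C1 = FVR & SVR   # points common to both views -> one leg
    C2 = FVR - C1    # remaining front-view points -> the other leg
    return C1, C2

# Front-view rectangle holds legs A and B; side view holds legs A and C.
front = [1, 2, 3, 4]   # leg A: {1, 2}, leg B: {3, 4}
side = [1, 2, 5, 6]    # leg A: {1, 2}, leg C: {5, 6}
C1, C2 = separate_clusters(front, side)   # C1 == {1, 2}, C2 == {3, 4}
```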

The alpha-shapes of the separated point cluster sets C1, C2, C3, and C4 are calculated. In the next step, a cubic periodic B-spline interpolation of the points is obtained from the alpha-shapes, which results in the curves shown in Fig 35. Table 5 depicts the number of points, slice thickness, and the number of slices of the table.

Fig 35 Extruded CAD geometry of the table

Table 5 Four-legged table

Case study Number of points Slice thickness (units) Number of slices

Table 499,899 15 46

5 VALIDATION OF CAD GEOMETRY

Validation is an important step in determining the accuracy of the resultant 3-D CAD geometry with respect to the original scan point cloud data. To perform validation, the original input scan point cloud and the points generated from the surface of the extruded NURBS geometry are registered using the Iterative Closest Point algorithm adapted from [20]. The root mean square (RMS) distance error and the average distance error between the original scan data and the obtained CAD geometry are calculated. An implementation of this algorithm, available as a Siemens NX application developed by Zhang [21], was used to validate the results. Case studies were performed on the extruded geometries of the Stanford bunny, the hand, and the table.
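The two error measures can be sketched with a nearest-neighbour query. This is a generic illustration, not the NX application of [21], and it assumes the two point sets are already registered:

```python
import numpy as np
from scipy.spatial import cKDTree

def distance_errors(scan, cad_points):
    """Distance from each scan point to its nearest neighbour in the
    CAD surface point set; returns (average error, RMSE)."""
    d, _ = cKDTree(cad_points).query(scan)
    return d.mean(), np.sqrt(np.mean(d ** 2))

# Illustrative data: CAD points on a planar grid; scan points offset
# by 0.1 along z, so every nearest-neighbour distance is exactly 0.1.
cad = np.array([[x, y, 0.0] for x in range(3) for y in range(3)], dtype=float)
scan = cad + [0.0, 0.0, 0.1]
avg, rmse = distance_errors(scan, cad)   # both equal 0.1 here
```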

5.1 STANFORD BUNNY – VALIDATION

A case study was performed on the Stanford bunny to calculate the Root Mean Squared distance Error (RMSE) between the original scan point cloud and a point cloud sampled from the surface of the NURBS geometry. The span of the bunny point cloud is shown in Fig 36. The RMSE values obtained for geometries generated at various slice thicknesses are shown in Fig 37, and the errors are tabulated in Table 6.

Fig 36 Bounding box – bunny (adapted from [10])


Fig 37 Point registration for geometries obtained using different slice thicknesses for bunny

Table 6 Root mean squared distance error calculations for the bunny

Model            Slice thickness (units)   Slicing direction   Number of slices   Span in slicing direction (units)   Average distance error (units)   RMSE (units)
Stanford bunny   0.002                     Y-axis              77                 0.1543                              0.0023                           0.005
Stanford bunny   0.003                     Y-axis              51                 0.1543                              0.002                            0.0047
Stanford bunny   0.004                     Y-axis              38                 0.1543                              0.0022                           0.0049
Stanford bunny   0.005                     Y-axis              30                 0.1543                              0.0025                           0.0051

5.2 HUMAN HAND - VALIDATION

The RMSE was calculated for the hand by slicing along the Y-axis at a thickness t = 5 mm. The span of the hand point cloud is shown in Fig 38. The results are shown in Fig 39 and the RMSE values are tabulated in Table 7.

Fig 38 Bounding box – hand (adapted from [28])

Fig 39 Point registration for extruded hand geometry

Table 7 RMSE of the human hand

Model        Slice thickness (units)   Slicing direction   Number of slices   Span in slicing direction (units)   Average distance error (units)   RMSE (units)
Human hand   5                         Y-axis              36                 183.1419                            0.6146                           0.9765

5.3 TABLE - VALIDATION

The RMSE was calculated for the four-legged side table sliced along the Y-axis at a

thickness t = 15. The span of the point cloud is shown in Fig 40. The results after point registration

are shown in Fig 41 and RMSE tabulated in Table 8.

Fig 40 Bounding box – table (adapted from [28])

Fig 41 Point registration for extruded table geometry

Table 8 RMSE of the table

Model   Slice thickness (units)   Slicing direction   Number of slices   Span in slicing direction (units)   Average distance error (units)   RMSE (units)
Table   15                        Y-axis              46                 702.6620                            4.7253                           6.1409

5.4 RESULTS – DISCUSSION

Based on RMSE calculations, the following observations can be made:

5.4.1 OPTIMUM SLICE THICKNESS FOR ACCURACY OF FINAL GEOMETRY

Table 6 shows that the smallest RMSE value for the bunny occurs at a slice thickness of t = 0.003mm. Decreasing the slice thickness to t = 0.002mm increases the error, possibly because the data points between slices become sparser as the slice thickness decreases. Increasing the slice thickness to t = 0.004mm and t = 0.005mm also increases the error. The conclusion is that, for the bunny, t = 0.003mm is the optimum slice thickness for reverse engineering.

5.4.2 IMPACT OF NUMBER OF LAYERS ON ACCURACY

Upon reducing the slice thickness of the bunny below t = 0.002mm, the points on every layer become too sparse for a good approximation of the boundary of the sliced scan data. In the calculation of silhouette points in the image processing step, the pixels of adjacent layers lie too close together and interfere with one another. This is a limitation of the image-processing-based clustering methodology, so the slice thickness cannot be reduced below a certain value for a given point cloud. Upon increasing the slice thickness beyond t = 0.003mm, the error keeps increasing as expected, resulting in inaccurate built geometry.

5.4.3 ADAPTIVE SLICING SCHEME

An adaptive slicing approach, in which the slice thickness is smaller on curved sections that define critical layers of the part and larger on cross-sections that do not contribute a significant area or curvature change, needs to be studied. Siraskar et al. [22] outline a scheme for adaptive slicing for AM, although their analysis was performed using STL and octree data structures. Further study is required to link such an adaptive slicing scheme with this approach to reverse engineering of point clouds.

5.4.4 IMPACT OF BUILD DIRECTION ON ACCURACY OF FINAL GEOMETRY

The bunny was sliced along both the X and Y axes with the same slice thickness t = 0.005mm. The RMSE of the final geometry was calculated and compared to identify an optimal slicing scheme. The results after slicing in the X and Y directions and point registration for RMSE calculation are shown in Fig 42. The obtained RMSE values are tabulated in Table 9.

Fig 42 Slicing schemes and point cloud registration of the bunny

Table 9 RMSE for different build orientations of the bunny

Model            Slice thickness (units)   Slicing direction   Number of slices   Span in slicing direction (units)   Average distance error (units)   RMSE (units)
Stanford bunny   0.005                     Y-axis              30                 0.1543                              0.0025                           0.0051
Stanford bunny   0.005                     X-axis              31                 0.1557                              0.0017                           0.0025
It is observed that slicing the bunny along the X-axis resulted in about a 50% decrease in the RMSE value. Therefore, it is concluded that, for the same slice thickness, the slicing direction plays a significant role in the accuracy of the final geometry.

The hand was sliced along both X and Y axes as shown in Fig 43 and the RMSE results are

tabulated in Table 10.

Fig 43 Slicing schemes and point registration of the hand

Table 10 RMSE for different build orientations of the hand

Model        Slice thickness (units)   Slicing direction   Number of slices   Span in slicing direction (units)   Average distance error (units)   RMSE (units)
Human hand   5                         Y-axis              36                 183.1419                            0.6146                           0.9765
Human hand   5                         X-axis              31                 158.4005                            1.1982                           1.7446

From Table 10, we can conclude that it is better to slice the hand along the Y-axis at the slice

thickness considered.

6 CONCLUSIONS AND FUTURE SCOPE

This thesis proposes a methodology to reverse engineer a point cloud directly into CAD geometry, without the necessity of obtaining the data in a 3D mesh representation. The approach processes a 3-D scan data cloud by applying geometric transformations to obtain 2-D stacked point cloud layers. The points lying on each layer or slice are clustered according to the part's geometric features using an image processing methodology. The separated point clusters are then processed using alpha-shapes to determine the contour or shape of the part feature on that slice. Calculating the shape amounts to obtaining point connectivity from a set of unorganized points in a plane. Once the connectivity information is obtained, the points are interpolated with a cubic periodic B-spline to form smooth layer-wise curves. The curves are used to generate the part geometry, analogously to the layer-wise additive manufacturing process, by extrusion or lofting in a CAD environment.

Although alpha-shapes are closely connected to the Delaunay triangulation of point sets, the problems that occur with STL meshes, such as extensive manual repair to obtain a manifold 3-D mesh from a point cloud, inverted triangular face normals, inconsistent triangular mesh edge connectivity, and mesh tears, are eliminated. Because the final geometry is in the CAD domain, any modification of the part design becomes a simple routine operation in the CAD software. A CAD file format (neutral: IGES, STEP; native: .prt, .jt, .x_t, .dwg, etc.) can be easily translated into other file formats for manufacturing purposes (AM: STL, AMF, 3MF, Collada, etc.). The final geometry has been validated by registration of the original scan point cloud with the obtained CAD geometry and calculation of the root mean squared distance error.

As Additive Manufacturing principles are applied in this approach, future work points toward the quantification of step errors due to layer-wise CAD extrusion, and toward an adaptive slicing scheme to mitigate this error, i.e., generating finer layers on critical part regions that require higher surface detail and thicker layers where the cross-section of the part does not vary considerably. The present methodology applies only to surface-level scan data, such as the data obtained from laser scanners and CMMs. The concept can be further extended to data obtained from an object's interior, such as CT/MRI scans.

7 REFERENCES

[1] Tamas Varady, Ralph R Martin and Jordan Cox, 1997, “Reverse Engineering of geometric
models – an introduction” Computer-Aided Design, Vol. 29, No.4, pp. 255-268.
[2] Lee SH, Kim HC, Hur SM, Yang DY. STL file generation from measured point data by
segmentation and Delaunay triangulation. Computer-Aided Design. 2002 Sep 1;34(10):691-704.
[3] Li L, Schemenauer N, Peng X, Zeng Y, Gu P. A reverse engineering system for rapid
manufacturing of complex objects. Robotics and Computer-Integrated Manufacturing. 2002 Feb
1;18(1):53-67.
[4] Lee KH, Woo H. Direct integration of reverse engineering and rapid prototyping. Computers
& Industrial Engineering. 2000 Jan 1;38(1):21-38.
[5] Wu YF, Wong YS, Loh HT, Zhang YF. Modeling cloud data using an adaptive slicing
approach. Computer-Aided Design. 2004 Mar 1;36(3):231-40.
[6] Lee IK. Curve reconstruction from unorganized points. Computer-aided geometric design.
2000 Feb 1;17(2):161-77.
[7] Eck M, Hoppe H. Automatic reconstruction of B-spline surfaces of arbitrary topological type.
InProceedings of the 23rd annual conference on Computer graphics and interactive techniques
1996 Aug 1 (pp. 325-334).
[8] Edelsbrunner H, Kirkpatrick D, Seidel R. On the shape of a set of points in the plane. IEEE
Transactions on information theory. 1983 Jul;29(4):551-9.
[9] Yang Z, Deng J, Chen F. Fitting unorganized point clouds with active implicit B-spline curves.
The Visual Computer. 2005 Sep 1;21(8-10):831-9.
[10] Turk G, Levoy M. The Stanford bunny.
[11] Edelsbrunner H. Alpha shapes—a survey. Tessellations in the Sciences. 2010 Jan;27:1-25.
[12] Akkiraju N, Edelsbrunner H, Facello M, Fu P, Mucke EP, Varela C. Alpha shapes: definition
and software. InProceedings of the 1st International Computational Geometry Software Workshop
1995 Jan 20 (Vol. 63, p. 66).
[13] MATLAB Image Processing Toolbox Release 2018b, The MathWorks, Inc., Natick,
Massachusetts, United States.
[14] Rogers DF. Mathematical elements for computer graphics.
[15] Vaidya R, Anand S. Image processing assisted tools for pre-and post-processing operations
in additive manufacturing. Procedia Manufacturing. 2016 Jan 1;5:958-73.
[16] MATLAB Computational Geometry Toolbox Release 2018b, The MathWorks, Inc., Natick,
Massachusetts, United States.
[17] Lee ET. Choosing nodes in parametric curve interpolation. Computer-Aided Design. 1989 Jul
1;21(6):363-70.

[18] MATLAB Curve Fitting Toolbox 2018b, The MathWorks, Inc., Natick, Massachusetts,
United States.
[19] Piegl L, Tiller W. The NURBS book. Springer Science & Business Media; 2012 Dec 6.
[20] Besl PJ, McKay ND. Method for registration of 3-D shapes. InSensor fusion IV: control
paradigms and data structures 1992 Apr 30 (Vol. 1611, pp. 586-606). International Society for
Optics and Photonics.
[21] Botao Zhang, Iterative Closest Point algorithm : Siemens NX application, Center for Global
Design & Manufacturing, University of Cincinnati
[22] Siraskar N, Paul R, Anand S. Adaptive slicing in additive manufacturing process using a
modified boundary octree data structure. Journal of Manufacturing Science and Engineering. 2015
Feb 1;137(1).
[23] Pratt WK. Digital Image Processing. Wiley-Interscience.
[24] Meshlab, Rhino, Geomagic
[25] Puntambekar NV, Jablokow AG, Sommer III HJ. Unified review of 3D model generation for
reverse engineering. Computer Integrated Manufacturing Systems. 1994 Nov 1;7(4):259-68.
[26] Yuwen S, Dongming G, Zhenyuan J, Weijun L. B-spline surface reconstruction and direct
slicing from point clouds. The International Journal of Advanced Manufacturing Technology.
2006 Feb 1;27(9-10):918-24.
[27] Ma WY, Kruth JP. NURBS curve and surface fitting for reverse engineering. The
International Journal of Advanced Manufacturing Technology. 1998 Dec 1;14(12):918-27.
[28] Hand, Classic side table downloaded from https://www.artec3d.com/3d-models/ply
[29] Turbine Blade downloaded from http://graphics.im.ntu.edu.tw/~robin/courses/gm05/model/
