
Interests: Physics Based Modeling | Animation | Medical Imaging

Low Cost 3D Ultrasound 

[figure: reconstructed vessels]

Conventional two-dimensional (2D) ultrasound imaging is relatively safe, economical, and fast, yet operator dependent: it yields only selected cross-sectional images of a complete three-dimensional (3D) anatomic volume. 3D ultrasound imaging can capture a region of interest more thoroughly. While 3D acquisition and reconstruction methods have been developed, the existing systems can be expensive, mostly ranging from $10,000 to $75,000.

In the summer of 2019, I worked on reimplementing the reconstruction algorithm for low-cost 3D ultrasound under Dr. Herickhoff. Two Inertial Measurement Units (IMUs) and two optical trackers were attached to an ultrasound transducer to record the transducer's movement and orientation. Each IMU cost $7.40; each optical tracker cost $6.60.

Here, I hope to share what I enjoyed learning throughout my summer.

The Hardware

Once we find the IMU orientation relative to gravity and the earth's magnetic field using the Madgwick filter and the ellipsoid fitting method, we can rotate the IMU frame to our reference frame. With one IMU, we reconstruct the path of the transducer; with the other, we place the transducer on that path with the proper orientation. We can then place the 2D ultrasound images we obtained through a video grabber accordingly, or map the RGBA values of each pixel to a voxel (a 3D pixel), and obtain a 3D model.
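As a sketch of this pixel-to-voxel mapping (not the lab's actual code: the names `quat_rotate` and `splat_image`, and the single-value voxel write, are my own simplifications), each pixel's position in the image plane is rotated by the transducer orientation, translated by the tracked position, and written into a voxel grid:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # v' = v + 2w (u x v) + 2 u x (u x v)
    return v + 2.0 * w * np.cross(u, v) + 2.0 * np.cross(u, np.cross(u, v))

def splat_image(volume, image, origin, q, pixel_size, voxel_size):
    """Write each pixel of one 2D ultrasound frame into a voxel grid.

    origin: 3D position of the frame's top-left corner (optical trackers);
    q: transducer orientation (IMUs); sizes are in the same physical units."""
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            # pixel position in the transducer's image plane (z = 0)
            local = np.array([c * pixel_size, r * pixel_size, 0.0])
            world = origin + quat_rotate(q, local)
            i, j, k = (world / voxel_size).astype(int)
            if 0 <= i < volume.shape[0] and 0 <= j < volume.shape[1] \
                    and 0 <= k < volume.shape[2]:
                volume[i, j, k] = image[r, c]
```

A real implementation would also blend overlapping frames rather than overwrite voxels, but this captures the idea.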

Results
[figures: reconstructed belly with gap, at different angles]

Qualitatively, the reconstructed models appear continuous. Future work involves (i) evaluating the volumetric accuracy of this method using phantoms and (ii) possibly alerting the user to "missed" spots.

[figures: sensor device attached to the 2D ultrasound probe]

An Inertial Measurement Unit (IMU) is a compact sensor composed of accelerometers, gyroscopes, and sometimes magnetometers; it is often found in smartphones and Nintendo Wii remotes. An optical tracker is an X-Y motion tracking sensor, commonly seen in computer mice.

We attached two LSM9DS1s (IMUs), two PMT9123QS sensors (optical trackers), and an LPC1768 (a microcontroller with an Arm Cortex-M3 processor) to a traditional 2D ultrasound transducer.

While there is much more to the hardware and its complications, this is all we need to understand the general idea behind the work: as the 2D ultrasound transducer sweeps the surface of interest, the IMUs collect the transducer's orientations and the optical trackers collect its positions. With these orientations and positions, we map the 2D ultrasound images into 3D space.
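A minimal sketch of this idea (the function names, and the assumption that the image plane spans the transducer's local x-y plane, are mine, not the lab's) computes where the corners of one 2D frame land in 3D space from a tracked position and orientation:

```python
import numpy as np

def quat_to_matrix(q):
    """Rotation matrix from unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def frame_corners(position, q, width, height):
    """World-space corners of one 2D ultrasound frame.

    position: transducer position (optical trackers); q: orientation (IMUs);
    width, height: physical size of the image plane."""
    R = quat_to_matrix(q)
    # corners in the transducer's local image plane (z = 0)
    local = np.array([[0, 0, 0], [width, 0, 0],
                      [width, height, 0], [0, height, 0]], dtype=float)
    return position + local @ R.T
```

Repeating this per frame, with per-frame poses from the sensors, tiles the 2D images through the swept volume.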

Comment

I really enjoyed learning about medical imaging and its applications at the REU. I would like to thank the Stanford Ultrasound Imaging & Instrumentation Lab and the IMMERS Lab for giving me the chance to work on this project and for advising me. I would also like to acknowledge Junhong Lin and other members of the lab for their work on the hardware and software; I simply rewrote the reconstruction code outside of OpenGL to write 3D models into OBJ/PLY files.

The Reconstruction Algorithm

Now let us expand on this general idea slightly further. An IMU can provide a complete measurement of orientation relative to the direction of gravity and the earth's magnetic field: a gyroscope measures angular velocity, an accelerometer measures the earth's gravitational field, and a magnetometer measures the earth's magnetic field. As these measurements are subject to high levels of noise, an orientation filter like the Madgwick filter [Madgwick 2010] helps reduce the noise.

The Madgwick filter utilizes quaternions, essentially "vectors" with four components. A quaternion has the form

$$q = q_1 + q_2\, i + q_3\, j + q_4\, k$$

for $i^2 = j^2 = k^2 = ijk = -1$, where $(q_2, q_3, q_4)$ behaves like a vector in 3D vector space. Now you might think, "Why use quaternions instead of conventional 3D vectors or coordinates? Three-dimensional space has three axes, so we only need three of the four components to cover the space." Well, you are correct: we only need three to span 3D space. But quaternions can express rotations in 3D space in a very simple way.

Consider rotating the point $(1, 1, 0)$, written as the pure quaternion $p = i + j$, around some axis by some angle. Note how we can express the rotation axis as a 3D vector and the rotation angle as a scalar: the quaternion $q = 1 + i$ rotates around the $i$-axis by an angle encoded by the scalar part "1". Multiply $p$ by $q$ on the left and by its conjugate $q^* = 1 - i$ on the right, and divide by $|q|^2 = 2$:

$$\frac{q\, p\, q^*}{|q|^2} = \frac{(1+i)(i+j)(1-i)}{2} = \frac{2i + 2k}{2} = i + k.$$

We have now rotated $(1, 1, 0)$ around the $i$-axis by 90 degrees. While expressing the rotation angle as a scalar is not as intuitive as expressing the rotation axis as a 3D vector, the relation can be derived using the absolute value of the quaternion: writing $q = |q|\,\big(\cos(\theta/2) + \sin(\theta/2)\, i\big)$ gives $\cos(\theta/2) = 1/\sqrt{2}$, i.e. $\theta = 90^\circ$.

$$(1, 1, 0) \rightarrow (1, 0, 1)$$

The full derivation is not shown here, but we get a sense of what quaternions are and why the notation is convenient for expressing rotations in 3D space: a single quaternion encodes both the rotation axis and the rotation angle.
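The worked example can be checked numerically in a few lines of Python (the helper `qmul` is the standard Hamilton product, not code from the project):

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

# The point (1, 1, 0) as the pure quaternion p = i + j.
p = np.array([0.0, 1.0, 1.0, 0.0])
# The rotation quaternion q = 1 + i (90 degrees about the i-axis).
q = np.array([1.0, 1.0, 0.0, 0.0])
q_conj = np.array([1.0, -1.0, 0.0, 0.0])
# q p q* / |q|^2
rotated = qmul(qmul(q, p), q_conj) / np.dot(q, q)
print(rotated[1:])  # -> [1. 0. 1.]
```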

Similarly, orientations and rotations of frames can be expressed with quaternions and their products. Here we take the derived equation from Madgwick's paper:

$${}^B v = {}^A_B \hat{q} \otimes {}^A v \otimes {}^A_B \hat{q}^*$$

where ${}^A v$ and ${}^B v$ denote the same vector in frame A and frame B, ${}^A_B \hat{q}$ the orientation of frame B relative to frame A, ${}^A_B \hat{q}^*$ the conjugate of ${}^A_B \hat{q}$, and $\otimes$ the quaternion product. When we apply this equation to our sensor frame S and estimated orientation relative to the earth frame E, we get:

1. An orientation estimate from the gyroscope:
$${}^S_E \dot{q}_{\omega,t} = \tfrac{1}{2}\, {}^S_E \hat{q}_{est,t-1} \otimes {}^S \omega_t, \qquad {}^S_E q_{\omega,t} = {}^S_E \hat{q}_{est,t-1} + {}^S_E \dot{q}_{\omega,t}\, \Delta t$$
where ${}^S \omega_t$ is the angular rate measured by the gyroscope at time $t$ in the sensor frame.

2. An orientation estimate from the accelerometer, via a gradient descent algorithm with $n$ iterations and step size $\mu$:
$${}^S_E q_{k+1} = {}^S_E \hat{q}_k - \mu\, \frac{\nabla f}{\|\nabla f\|}, \qquad k = 1, \dots, n$$
where
$$f\big({}^S_E \hat{q},\, {}^E \hat{g},\, {}^S \hat{a}\big) = {}^S_E \hat{q}^* \otimes {}^E \hat{g} \otimes {}^S_E \hat{q} - {}^S \hat{a}$$
and
$$\nabla f = J^T\big({}^S_E \hat{q},\, {}^E \hat{g}\big)\, f\big({}^S_E \hat{q},\, {}^E \hat{g},\, {}^S \hat{a}\big).$$
${}^E \hat{g} = (0, 0, 0, 1)$ is gravity; ${}^S \hat{a}$ is the accelerometer measurement.

3. The same as above with ${}^E \hat{b}$ (the predefined magnetic field) and ${}^S \hat{m}$ (the magnetometer measurement) in place of ${}^E \hat{g}$ and ${}^S \hat{a}$, except that, unlike ${}^E \hat{g}$, we do not yet know ${}^E \hat{b}$.

This leads us to our next section: finding the earth's magnetic field using the ellipsoid fitting method of Jiancheng Fang et al. [Fang et al. 2011].
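Steps 1 and 2 can be sketched in Python for the accelerometer-only case. This is a simplified, unoptimized rendering of the published filter, not the lab's implementation; the expanded gravity objective and Jacobian follow Madgwick's report, and `beta` weights the gradient correction:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def madgwick_imu_step(q, gyro, accel, beta, dt):
    """One accelerometer-only update of the Madgwick orientation filter.

    q: current orientation estimate (w, x, y, z); gyro: angular rate (rad/s);
    accel: accelerometer reading; beta: gradient weight; dt: sample period."""
    w, x, y, z = q
    # Step 1: rate of change of orientation from the gyroscope,
    # q_dot = (1/2) q (x) (0, omega)
    q_dot = 0.5 * qmul(q, np.array([0.0, *gyro]))
    # Step 2: gravity objective f = q* (x) (0,0,0,1) (x) q - (0, a_hat),
    # expanded analytically, and its Jacobian
    a = accel / np.linalg.norm(accel)
    f = np.array([
        2.0*(x*z - w*y) - a[0],
        2.0*(w*x + y*z) - a[1],
        2.0*(0.5 - x*x - y*y) - a[2],
    ])
    J = np.array([
        [-2.0*y, 2.0*z, -2.0*w, 2.0*x],
        [ 2.0*x, 2.0*w,  2.0*z, 2.0*y],
        [   0.0, -4.0*x, -4.0*y,  0.0],
    ])
    grad = J.T @ f
    norm = np.linalg.norm(grad)
    if norm > 1e-12:
        grad /= norm
    # Fuse the two estimates and renormalize
    q_new = q + (q_dot - beta * grad) * dt
    return q_new / np.linalg.norm(q_new)
```

Step 3 would add the same gradient term for the magnetometer objective; production implementations also compensate gyroscope bias drift.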

[figures: collected magnetic field (left), corrected magnetic field (right)]

credit: Junhong Lin, now a graduate student @ MIT EECS

We start from the conicoid/ellipsoid equation

$$a x^2 + b y^2 + c z^2 + 2f\, yz + 2g\, xz + 2h\, xy + 2p\, x + 2q\, y + 2r\, z + d = 0.$$

The paper uses the least-squares method, which minimizes the sum of squares of the left-hand side over all magnetometer samples to solve for the coefficients.

With the solved coefficients, the hard-iron offset is the center of the ellipsoid,

$$x_0 = -A^{-1} \begin{pmatrix} p \\ q \\ r \end{pmatrix}, \qquad A = \begin{pmatrix} a & h & g \\ h & b & f \\ g & f & c \end{pmatrix},$$

and a soft-iron correction matrix $W$ with $W^T W \propto A$ maps the measured ellipsoid back onto a sphere: $m_{cal} = W\,(m - x_0)$.

The collected magnetic field and the corrected magnetic field by Junhong Lin can be seen above.
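A minimal sketch of the fit, assuming the standard formulation in which the ellipsoid center gives the hard-iron offset (illustrative code, not the lab's):

```python
import numpy as np

def fit_ellipsoid(m):
    """Least-squares ellipsoid fit to Nx3 magnetometer samples.

    Solves a x^2 + b y^2 + c z^2 + 2f yz + 2g xz + 2h xy
           + 2p x + 2q y + 2r z = 1 for the nine coefficients
    (the constant term is normalized to -1)."""
    x, y, z = m[:, 0], m[:, 1], m[:, 2]
    D = np.column_stack([x*x, y*y, z*z, 2*y*z, 2*x*z, 2*x*y, 2*x, 2*y, 2*z])
    coef, *_ = np.linalg.lstsq(D, np.ones(len(m)), rcond=None)
    a, b, c, f, g, h, p, q, r = coef
    A = np.array([[a, h, g], [h, b, f], [g, f, c]])
    # ellipsoid center = hard-iron offset
    center = -np.linalg.solve(A, np.array([p, q, r]))
    return center, A

# Synthetic check: samples on a sphere of radius 0.5, offset by (0.2, -0.1, 0.3)
rng = np.random.default_rng(0)
v = rng.normal(size=(500, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
samples = 0.5 * v + np.array([0.2, -0.1, 0.3])
center, _ = fit_ellipsoid(samples)
```

The recovered `center` matches the injected offset; subtracting it (and applying the soft-iron matrix derived from $A$) yields the corrected field.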
