
CTDIvol vs DLP – a simple explanation

What do they represent? CTDIvol is based on measurements obtained when scanning either a 16 cm or 32 cm phantom. Essentially, it represents scanner output. DLP is derived from CTDIvol but incorporates a scan-length component. Both function as reasonable proxies for absorbed dose but do not represent the actual patient dose. In other words, if your CTDIvol and/or DLP is twice as high as it needs to be, the doses you are imparting will be about twice as high as they need to be.

Can CTDIvol and DLP results tell me two different things? Yes. CTDIvol represents the output when scanning a phantom, while DLP takes into account the scan length. We've seen instances where CTDIvol was considered well within a "normal" range but DLP was unexpectedly high. We found the scan settings were appropriate for the study, but the exam length was longer than what others were using.

For example, a Chest CT could be started too high into the neck and ended too far into the abdomen. In that case the CTDIvol (basic scanner settings) could be just fine, but because the scan extended more than necessary above and/or below the requested area, the DLP could easily be too high.


The following information is a more technical discussion of CTDIvol and DLP for those interested. It is taken from an article posted on

CT Dose Index Volume (CTDIvol)

The CTDIvol can be calculated as:

CTDIvol = [(N x T)/I] x CTDIw

where
CTDIw = weighted (average) CTDI across the field of view
N = number of simultaneous axial scans per x-ray source rotation
T = thickness of one axial scan (mm)
I = table increment per axial scan (mm)

In helical CT the ratio of I to (N x T) is the pitch; therefore, in helical mode: CTDIvol = (1/pitch) x CTDIw
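The relations above can be sketched in a few lines of Python. The collimation, table-feed, and CTDIw values below are purely illustrative, not measurements:

```python
# Sketch of the CTDIvol relations above (hypothetical values, not measured data).

def pitch(table_increment_mm: float, n_slices: int, slice_thickness_mm: float) -> float:
    """Pitch = I / (N x T) for helical scanning."""
    return table_increment_mm / (n_slices * slice_thickness_mm)

def ctdi_vol(ctdi_w_mgy: float, pitch_value: float) -> float:
    """CTDIvol = (1/pitch) x CTDIw."""
    return ctdi_w_mgy / pitch_value

# Example: 16 x 0.75 mm collimation with 18 mm table feed per rotation -> pitch 1.5
p = pitch(18.0, 16, 0.75)
print(p)                  # 1.5
print(ctdi_vol(15.0, p))  # a CTDIw of 15 mGy yields a CTDIvol of 10 mGy
```

Note how a pitch greater than 1 (gaps between rotations) lowers CTDIvol relative to CTDIw, while a pitch below 1 (overlap) raises it.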

CTDIvol (or CTDI volume) represents the dose for a specific scan protocol, taking into account gaps and overlaps between the radiation dose profiles from consecutive rotations of the x-ray source. CTDIw therefore represents the average radiation dose over the x and y directions, whereas CTDIvol represents the average radiation dose over the x, y, and z directions.

Dose Length Product

The dose length product (DLP) is a measure of the ionizing radiation exposure during the entire acquisition of images.

Therefore, DLP (mGy-cm) = CTDIvol (mGy) x irradiated length (cm). (The irradiated length is usually longer than the imaged length in helical scanning.)

CTDIw and CTDIvol are independent of scan length for determining the total energy absorbed whereas DLP is proportional to scan length.
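The scan-length dependence is easy to see numerically; the values below are made up for illustration:

```python
# DLP = CTDIvol x irradiated length, per the formula above (illustrative values only).

def dlp(ctdi_vol_mgy: float, irradiated_length_cm: float) -> float:
    """Dose length product in mGy-cm."""
    return ctdi_vol_mgy * irradiated_length_cm

# Same CTDIvol, two scan lengths: DLP scales with length while CTDIvol does not.
print(dlp(10.0, 30.0))  # 300.0 mGy-cm
print(dlp(10.0, 45.0))  # 450.0 mGy-cm (50% longer scan -> 50% higher DLP)
```

This is exactly the Chest CT scenario described earlier: identical scanner settings, but extra scanned length above and below the requested area drives DLP up.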

Need help getting more from your participation in the ACR's Dose Index Registry®? Let Dose Registry Support Services tailor a program designed specifically to help your department succeed. Contact Dose Registry Support Services to see how we can help.


End of Quarter: Time to Perform Your DIR Data Quality Checks

With the end of the quarter approaching, the ACR's NRDR sent a reminder to all participants to perform several quality checks so your report will be as accurate as possible. These checks include making sure:

  1. The total volume of exams recorded in the DIR for your facility is reasonably close to the volume of studies you’ve performed;
  2. That any currently unmapped studies are mapped to an appropriate RPID; and,
  3. That you review your currently mapped studies to ensure they are assigned to the correct RPID.

I copied the NRDR’s email below in case you missed, or didn’t receive, it.   Please review and check your data.

Confirming Exam Name Mapping Accuracy

Checking your currently mapped studies to ensure they are mapped correctly can be challenging. For example, the original facility we support began using the DIR in 2013 and currently has over 330 study names mapped to about 70 different RPIDs. Suffice it to say, some of the original studies were assigned incorrectly along the way. The challenge they faced was how to identify the incorrectly mapped studies: looking at the list one entry at a time is tedious and time-consuming.

That is where Dose Registry Support Services was able to help: we developed a method of exporting and organizing all mapped studies in a way that made it easy to group studies by RPID, which in turn made it easier to identify mis-mapped studies.
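The grouping idea can be sketched in a few lines: organize the exported exam-name mappings by RPID so that study names grouped under an unexpected RPID stand out. The study names and RPID labels below are placeholders, not actual RadLex IDs or DIR data:

```python
# Group exported exam-name mappings by RPID to spot likely mis-mappings.
# All names and RPID labels here are hypothetical placeholders.
from collections import defaultdict

mappings = [
    ("CT HEAD WO", "RPID-A"),
    ("CT BRAIN WO CONTRAST", "RPID-A"),
    ("CT ABD PELVIS WO", "RPID-B"),
    ("CT HEAD TRAUMA", "RPID-B"),  # a head study grouped under an abdomen RPID stands out
]

by_rpid = defaultdict(list)
for study_name, rpid in mappings:
    by_rpid[rpid].append(study_name)

# Reviewing each RPID's list of study names together makes outliers obvious.
for rpid, names in sorted(by_rpid.items()):
    print(rpid, names)
```

Scanning each RPID's group as a unit, rather than reading the mapping list row by row, is what makes the review tractable for hundreds of study names.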

Check back here soon. Sometime in the next few days I plan to post step-by-step instructions describing the method we developed to easily identify incorrectly mapped studies. We've used that approach for years and find it very useful. I will likely post the announcement in a new blog post, but will put the instructions in an article on one of our web pages.

As always, we are here to help.

Michael Bohl
Dose Registry Support Services




Dear DIR Participant,

We are preparing for the next aggregate report within the following week. Before we do so, we would like you to perform a data quality review of your own data. Please review the information below, follow the directions, and if you find errors in your data, inform us right away. Failing to conduct a data quality review may result in a less accurate quarterly aggregate feedback report.


We highly encourage you to review your submitted data at least every two months. We refer to this as a 'data quality check.' You may receive email notices from NRDR reminding you to perform a 'data quality check,' but if you do not receive a notice, you are still responsible for performing this critical step in preparation for your aggregate Feedback Report. To determine if we are receiving your data, please log into the NRDR portal using your log-in credentials. After you have logged in, go to the DIR on the menu (left-hand side of the screen) and click to open the DIR Menu.

Comparing Volume of Exams Received vs. Volume of Exams Sent

  1. Under the DIR Report subheading, there are many reporting tools. To determine the last time your facility submitted exam data to the NRDR, use 'Summary of Data Submitted,' ignore the date range fields, and click 'Submit.' The result of the search will show whether we have received exam data from your facility recently.
  2. Clear the query to conduct a new one. Using 'Summary of Data Submitted,' include one of the date ranges provided below to coordinate with the most recent reporting period. Make a note of the number of exams received during this reporting period.
    January 1 – March 31 (quarterly Q1)
    January 1 – June 30 (semi-annual Q1Q2)
    July 1 – September 30 (quarterly Q3)
    October 1 – December 31 (semi-annual Q3Q4)

Compare the NRDR number of exams received to your volume of exams sent (your PACS may be able to help identify the number of exams sent).

  3. To know the volume of exams received per month, use 'Summary of Data Submitted' and change the date range to capture one month at a time. Compare the NRDR monthly total of exams received against your monthly total number of exams submitted.

If the difference between the NRDR numbers and your total number of exams submitted is greater than 5%, please contact us at
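The 5% comparison above is simple enough to script. A sketch with hypothetical exam counts (the function names are mine, not NRDR's):

```python
# Compare exams sent to the DIR against exams NRDR reports receiving.
# Counts below are hypothetical examples, not real facility volumes.

def volume_discrepancy_pct(sent: int, received: int) -> float:
    """Percent difference between exams sent and exams received."""
    return abs(sent - received) / sent * 100

def needs_follow_up(sent: int, received: int, threshold_pct: float = 5.0) -> bool:
    """True when the discrepancy exceeds the follow-up threshold."""
    return volume_discrepancy_pct(sent, received) > threshold_pct

print(needs_follow_up(1000, 970))  # 3% difference -> False
print(needs_follow_up(1000, 900))  # 10% difference -> True
```

Running this monthly against each scanner's volumes catches transmission problems well before the quarterly aggregate report.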

Determine if Each CT Scanner is Sending Exam Data

  1. Under the DIR Report subheading, there are many report tools that can provide you with this data. An easy report tool to work with is ‘Dose Information by Exam.’ Click it to open the report page.
  2. Enter a date range of at least 3 months so that you can review what your CT scanners have been sending since the last aggregate report (which is issued quarterly). When the page populates, go to the top of the screen and click 'Export to Excel.' Sort on the columns that contain information about your scanner, such as Institution name, Scanner model, and Scanner ID. In this manner, you can review the date that each of your scanners sent exam data to the NRDR. If any data is missing, or if an entire CT scanner's exam data is not appearing in the report, contact or call 703-390-9858 to troubleshoot data transmission.
  3. Also check the 'Study Description' column to confirm the names of your exams are being captured. If the rate of missing Study Descriptions for your exam names exceeds 5% (per scanner), please use the email and phone number above to contact us.

Using the Standardized Dose Index Report Tool

The purpose of the Standardized DIR Report tool is to provide a user-friendly reporting tool that can be searched by values not available in the other DIR reporting tools, a few of which have been mentioned above.

Map Exam Study Descriptions to a RadLex ID (RPID)

Exam names that have not been mapped to an RPID will not be included in the aggregate report. Please follow the instructions in the Exam Name Mapping User Guide to navigate this task. If you have a Master-Child facility registration, you must perform your exam name mapping at the Master facility. Any mapping in the Child facility will be overwritten by the RPIDs selected in the Master facility every 24 hours. Only a few Master-Child facility registrations have 'lifted' restrictions and are able to map at the Child facility; for the majority of Master-Child registrations, this is not the case. As a precaution, we suggest that you map at the Master facility level to avoid losing your RPID mappings at a Child facility that may not have 'lifted' restrictions in place.

Please complete your data quality check and RPID exam name mapping within two weeks of the date of this email.

Best regards,

The NRDR Team


Dose Reduction Case Study – Is Your CT Scanner Table Increasing Patient Dose?

During the 2017 Landauer Clinical Dose Optimization Symposium, one of the session speakers, Douglas Pfeiffer, a medical physicist with Blackthorn Medical Physics, reported that the table-head rest extension connection mechanism increased CTDIvol and DLP when the body part being imaged was positioned over the connection mechanism. Following the symposium, we worked with a facility for which we provide Dose Index Registry Support and DoseID services to see if we could replicate Mr. Pfeiffer's findings. The results: we found that scanning through the table-head rest extension connection mechanism increased doses by 28.5% to 31.7%. In the following article, I describe how we performed the test, our findings, and the changes being made to position patients away from the connection mechanism when possible.

Background:  This health system operates several scanners, two of which are Siemens Sensation 16 scanners.  The image below shows the head extension and the connection mechanism in the table end of one of the Sensation 16 scanners.  The black bar extending from the head rest inserts into the metal insert in the table end.  The facility uses this table configuration to perform CT Neck and CT Neck Angiograms, as well as to perform some extremity studies.   


Test Procedure: We set the scanner to use the following scan parameters: 120 kV; 217 Ref mAs; auto-exposure set to On; 0.75 scan rotation time; 0.8 slice thickness. We then imaged the 32 cm phantom twice. For the first scan we placed the 32 cm phantom on the scanner table over the table-head rest extension connection mechanism. We positioned the phantom at isocenter and scanned through the visibly denser area in the table, which represented the connection mechanism, noting the scan length. We then repositioned the phantom on the table away from the table-head rest extension connection mechanism, again at isocenter, and rescanned the phantom using the exact same parameters and scan length.

Results:  As the table below indicates, scanning the phantom when it was placed over the table-head rest extension connection mechanism increased CTDIvol by 28.5% and DLP by 31.7% when compared to scanning the phantom positioned on the table, but not over the connection mechanism.  Effective mAs (governed by auto-exposure) was 27.7% higher when the scan was performed through the table-head rest extension connection mechanism compared to scanning through the table top alone. 

[Table: Dose Diff Calcs]
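The percent-increase arithmetic behind these results is straightforward; here is a sketch using placeholder baseline values, not the facility's actual measurements:

```python
# Percent increase between a baseline and a repeat measurement.
# The 10.0 / 12.85 mGy pair below is a hypothetical illustration only.

def percent_increase(baseline: float, measured: float) -> float:
    """Return the increase of `measured` over `baseline` as a percentage."""
    return (measured - baseline) / baseline * 100

# e.g., a baseline CTDIvol of 10.0 mGy rising to 12.85 mGy over the mechanism:
print(round(percent_increase(10.0, 12.85), 1))  # 28.5
```

The same calculation applied to the measured CTDIvol, DLP, and effective mAs pairs yields the 28.5%, 31.7%, and 27.7% figures reported above.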

Discussion: This confirms that some scanners' head rest-table connection mechanisms result in higher patient doses when the scanned anatomy is positioned over the connection mechanism than when the scan is performed through the table top alone. We have not tested other scanner models, so we can only report on what we found on the Siemens Sensation 16 scanner used to perform this test. Also, depending upon the scan length, the overall dose increase during an actual study could be less than indicated here: our test scan was limited to the mechanism length, while an actual scan may use a longer scanning range, over which the scanner's auto-exposure would compensate for the less dense areas above and below the mechanism.

Summary: We found that scanning through the table-headrest connection mechanism increased CTDIvol and DLP by 28.5% and 31.7%, respectively, compared to scanning through the table top itself (avoiding the connection mechanism altogether). The facility is taking steps to change how they position patients relative to the head rest-table connection mechanism, when possible, to reduce patient dose.






DoseID helps facilities identify duplicate and superfluous scanner protocols

One of the more interesting and unique benefits of Dose Registry Support Services' DoseID Program is its ability to shed light on duplicate and superfluous protocols in use at facilities.

[Table: Duplicated Protocols]

The table to the right shows how one facility had 7 different protocols in use on a single scanner during a recent 3-month period. As you can see, all protocols are purportedly for a CT Abdomen/Pelvis without contrast exam, yet they are used with significantly varying frequency. One was used for 714 (79%) of the 903 without-contrast studies performed, yet 2 were used just under 40 times and 3 were used five or fewer times. One has to question why these protocols exist and on what basis staff are choosing the infrequently used protocols.

This creates several issues, ranging from hanging protocols in PACS to patient safety. For example, when it is time to alter the protocol, will whoever is updating the scanner parameters really change all 7 protocols? This is important as facilities review their dose reports and begin to make protocol changes to lower patient doses.

This is a widely recognized issue within the industry.  In its 2016-2017 development cycle Integrating the Healthcare Enterprise (IHE) is developing a new profile titled Enterprise Scanner Protocol Management to address this very issue.  Click here for more information on IHE’s initiative.

The good news is that, because of the information contained in our reports, we are able to help facilities identify and eliminate duplicate and superfluous scanner protocols.





DRSS’s DoseID Service Now Includes Draft Policies For Client Use

DRSS is now providing clients with the following 8 draft policies, which they may adapt for use at their facilities:

  • [Facility Name] CT Dose Optimization Policy
  • [Facility Name] Quality Control and Maintenance Activities Policy
  • [Facility Name] Physicist Evaluation Policy
  • [Facility Name] Verification and Documentation of Medical Physicist Qualifications Policy
  • [Facility Name] Verification and Documentation of CT Technologists Qualifications Policy
  • [Facility Name] Process Prior to Conducting a Diagnostic Imaging Study
  • [Facility Name] Documentation of CT Radiation Dose Index Policy
  • [Facility Name] Policy for Review and Analysis of Dose Incidents Where the Dose Index Exceeds Its Expected Range




36% of Joint Commission accredited facilities may be at risk of being cited for non-compliance with TJC's Dose Incident Identification & Review mandate

In a recent poll, 36% of facility administrators indicated they are using their CT scanners' XR-29 dose notification and alert capabilities to meet The Joint Commission's (TJC's) Dose Incident Identification and Review requirements. The reality is that a majority (and likely a vast majority) of these facilities are not in compliance with TJC's mandate.

Hint:  TJC requires the facility to establish a dose range for every scan performed at that facility and compare the total dose imparted for each scan against its expected range.

Who can tell me why this likely poses a problem?

[Image: TJC Dose Incident ID Process Poll]



Take the poll: How are facilities meeting The Joint Commission’s Dose Incident Identification Requirement?

I’m taking a poll on how facilities are meeting The Joint Commission’s requirement to identify Dose Incidents.  Those wishing to respond may do so using the web response process described below.  I will share the results in an upcoming LinkedIn post.  This poll will be live through midday, Wednesday, July 27, 2016.  No identifying information is required.

Note: This poll is intended for facilities accredited by The Joint Commission, to gauge how they are responding to this new requirement.

Poll: How does your facility identify studies whose total dose exceeded its expected range (a Dose Incident) for review and analysis, as required by The Joint Commission?

Answer choices:

A = We purchased a Third-party Dose Tracking Software solution
B = We use our scanner’s XR-29 Dose Notifications and Alert logs
C = We manually log doses at the end of each study and compare them to our  expected ranges
D = We aren’t doing anything yet
E = Other

Instructions for how to respond
Click on the link below
Select answer


Joint Commission Confirms Facilities Need an Expected Dose Range for All CT Protocols; Note to Self: XR-29 Does NOT Equal TJC Compliance

The following question was submitted to the Joint Commission:

Must we establish an expected dose range for every imaging protocol or can we pick and choose the protocols for which we establish expected dose ranges and monitor only those? I interpret [PI.02.01.01.A.6.] to require monitoring of every protocol we use, not just some portion of them.

The Joint Commission response:

All protocols need to have an expected dose index range included. For protocols that are of similar anatomical areas, I have suggested that a general range (like the AAPM alert levels or ACR Pass/Fail levels) be used. As data is collected, analyzed and benchmarked then the expected dose index range can be refined.

Associate Director Standards Interpretation Group, Standards Interpretation Division of Healthcare Improvement Group The Joint Commission


Dose Registry Support Services Discussion: We asked this question to confirm our suspicion that XR-29, by itself, does not meet TJC's dose incident identification requirement UNLESS facilities enter a dose range for every protocol they use, AND THEN have the ability to check the dose index UPON COMPLETION of the exam. This is much different from XR-29, which often tests only a subset of protocols and issues pre-scan alerts or notifications. Additionally, most XR-29 solutions do not establish a lower threshold in the system; by definition, a "range" requires both an upper and lower threshold. This was also confirmed with TJC during RSNA. In summary, be very careful if you plan to depend solely on your scanner's XR-29 capabilities to meet TJC's Dose Incident Identification requirements. It may not be set up to test doses on every protocol you use.

We have developed an effective, low-cost solution for establishing both an upper and lower threshold dose range, as well as for identifying "Dose Incidents" in which an exam exceeded the threshold. The only requirement: participation in the ACR's Dose Index Registry. Contact us to learn more.




CMS Final Rule Just Released: XR-29 Requirements are NOT Delayed

CMS will require CT scanners to comply with the XR-29 standard as of January 1, 2016, as planned. Procedures performed on non-compliant scanners will have their payment reduced by 5% in 2016 and 15% in 2017. Facilities must apply modifier "CT" to all claims for procedures performed on non-compliant scanners. The section below is a verbatim copy of the Final Rule's XR-29 section.


Changes for Computed Tomography (CT) under the Protecting Access to Medicare Act of 2014 (PAMA)

Section 218(a)(1) of the Protecting Access to Medicare Act of 2014 (PAMA) (Pub. L. 113-93) amended section 1834 of the Act by establishing a new subsection 1834(p). Effective for services furnished on or after January 1, 2016, new section 1834(p) of the Act reduces payment for the technical component (TC) of applicable CT services paid under the Medicare PFS and applicable CT services paid under the OPPS (a 5-percent reduction in 2016 and a 15-percent reduction in 2017 and subsequent years). The applicable CT services are identified by HCPCS codes 70450 through 70498; 71250 through 71275; 72125 through 72133; 72191 through 72194; 73200 through 73206; 73700 through 73706; 74150 through 74178; 74261 through 74263; and 75571 through 75574 (and any succeeding codes). As specified in section 1834(p)(4) of the Act, the reduction applies for applicable services furnished using equipment that does not meet each of the attributes of the National Electrical Manufacturers Association (NEMA) Standard XR-29-2013, entitled “Standard Attributes on CT Equipment Related to Dose Optimization and Management.” Section 1834(p)(4) of the Act also specifies that the Secretary may apply successor standards through rulemaking.

Section 1834(p)(6)(A) of the Act requires that information be provided and attested to by a supplier and a hospital outpatient department that indicates whether an applicable CT service was furnished that was not consistent with the standard set forth in section 1834(p)(4) of the Act (currently the NEMA CT equipment standard) and that such information may be included on a claim and may be a modifier. Section 1834(p)(6)(A) of the Act also provides that such information must be verified, as appropriate, as part of the periodic accreditation of suppliers under section 1834(e) of the Act and hospitals under section 1865(a) of the Act. Section 218(a)(2) of the PAMA made a conforming amendment to section 1848(c)(2)(B)(v) of the Act by adding a new subclause (VIII), which provides that, effective for fee schedules established beginning with 2016, reduced expenditures attributable to the application of the quality incentives for computed tomography under section 1834(p) of the Act shall not be taken into account for purposes of the budget neutrality calculation under the PFS.

To implement this provision, in the CY 2016 PFS proposed rule (80 FR 41716), we proposed to establish a new modifier to be used on claims that describes CT services furnished using equipment that does not meet each of the attributes of the NEMA Standard XR-29-2013. We proposed that, beginning January 1, 2016, hospitals and suppliers would be required to use this modifier on claims for CT scans described by any of the CPT codes identified in this section (and any successor codes) that are furnished on non-NEMA Standard XR-29-2013-compliant CT scans. We stated that the use of this proposed modifier would result in the applicable payment reduction for the CT service, as specified under section 1834(p) of the Act. We received the following comments on our proposal to require the modifier to be used on claims:

Many commenters endorsed the use of quality incentives to improve patient safety and optimize the use of radiation when providing CT diagnostic imaging services. Several commenters were supportive of the proposal to establish the modifier to identify CT services furnished using equipment that does not meet each of the attributes of the NEMA Standard XR-29-2013.

Comment: Several commenters requested that we delay implementation of section 1834(p) of the Act so that they have additional time to comply before the payment reduction becomes effective.

Response: The statute requires that we apply the payment adjustment for computed tomography services furnished on or after January 1, 2016. Given this language, we believe that we must implement this provision beginning January 1, 2016. Therefore, we are not delaying implementation of this provision. We note that the payment reduction for 2016 is 5 percent, and it then increases to 15 percent in subsequent years. Hospitals and suppliers that furnish services that do not meet the equipment standard as of January 1, 2016, will receive this 5 percent payment reduction during 2016, but will have an opportunity to upgrade their CT scanners before the larger payment adjustment that takes effect beginning in CY 2017.

Comment: One commenter cited section 1834(p)(4) of the Act, which specifies that through rulemaking, the Secretary may apply successor standards for CT equipment. The commenter indicated that CMS should develop successor standards that exempt CT scans performed on cone beam CT (CBCT) scanners that are FDA cleared only for imaging of the head from the requirement for Automatic Exposure Control (AEC) capability. This request was based on the AEC capability being unavailable on CBCT scanners.

Response: Although we agree with the commenter that the Secretary has authority to apply successor standards for CT equipment through notice and comment rulemaking, we would like to gain some experience with the NEMA Standard XR-29-2013 before adopting a successor standard. Therefore, we are not adopting a successor standard to the NEMA Standard XR-29-2013 in this final rule with comment period, but may consider doing so in future rulemaking.

After consideration of the public comments we received, we are finalizing the establishment of new modifier, “CT.” This 2-digit modifier will be added to the HCPCS annual file as of January 1, 2016, with the label “CT,” and the long descriptor “Computed tomography services furnished using equipment that does not meet each of the attributes of the National Electrical Manufacturers Association (NEMA) XR-29-2013 standard”.

Beginning January 1, 2016, hospitals and suppliers will be required to report the modifier “CT” on claims for CT scans described by any of the CPT codes identified in this section (and any successor codes) that are furnished on non-NEMA Standard XR-29-2013-compliant CT scanners. The use of this modifier will result in the applicable payment reduction for the CT service, as specified under section 1834(p) of the Act.

2016 MPFS Final Rule

Twitter: @BohlM


The Joint Commission’s Post Exam Incident Identification and Review Requirement

The Joint Commission (TJC) released its updated Diagnostic Imaging Requirements on August 10, 2015. While many of these new requirements pair nicely with the new XR-29 equipment standards for dose recording and for alerting the technologist when the system “projects” the prescribed scans will exceed the user-configured value (i.e., a pre-exam alert), the section titled Element of Performance for PI.02.01.01 A6 presents an interesting challenge to facilities. It requires imaging facilities to:

  1. Review and analyze incidents where the radiation dose index from diagnostic CT examinations exceeds expected dose index ranges established for your protocols; and then,
  2. Compare these incidents to external benchmarks.

Essentially, #1 above is a “post-exam” process. However, the new XR-29 standards do not provide this capability. The challenge for facilities, therefore, is establishing an efficient process for identifying when a dose exceeded its expected range. There are several third-party solutions that will help, but many facilities find them prohibitively expensive. Fortunately, Dose Registry Support Services has developed an easy-to-use, low-cost solution, based on the data each facility submits to the ACR's Dose Index Registry, that identifies every incident retrospectively at intervals the facility determines.
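The core of any such retrospective check is simple: compare each exam's dose index to the expected range established for its protocol. A minimal sketch, in which the protocol names, expected ranges, and exam records are all hypothetical, not DIR data or our actual method:

```python
# Retrospective dose-incident identification: flag any exam whose dose index
# falls outside the expected (lower, upper) range for its protocol.
# All protocols, ranges, and exam records below are hypothetical examples.

EXPECTED_RANGES = {
    "CT Abdomen/Pelvis WO": (5.0, 25.0),   # (lower, upper) CTDIvol in mGy
    "CT Head WO": (20.0, 75.0),
}

def find_dose_incidents(exam_records):
    """Return the exams whose dose index is outside the protocol's expected range."""
    incidents = []
    for exam in exam_records:
        low, high = EXPECTED_RANGES[exam["protocol"]]
        if not (low <= exam["ctdi_vol"] <= high):
            incidents.append(exam)
    return incidents

exams = [
    {"id": 1, "protocol": "CT Abdomen/Pelvis WO", "ctdi_vol": 12.0},
    {"id": 2, "protocol": "CT Abdomen/Pelvis WO", "ctdi_vol": 31.0},  # above range
    {"id": 3, "protocol": "CT Head WO", "ctdi_vol": 15.0},            # below range
]
print([e["id"] for e in find_dose_incidents(exams)])  # [2, 3]
```

Note that the range has both a lower and an upper bound: an unusually low dose index can signal a problem (wrong protocol, image quality too poor to diagnose) just as an unusually high one can.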

For more information visit or email