Contact Us

If you have questions or comments, feel free to contact us using the contact form or the information below.

One of our office staff will be happy to provide you with any information you might need.  

523 N Sam Houston Pkwy Suite 125
Houston, TX 77060
USA

(866) 675-6277

Professional Imaging is the largest provider of consultations for swallowing disorders in the United States.  

Our objective is to provide the most timely and in-depth medical evaluation based on each patient's needs.

We take pride in offering the most comprehensive, patient-centered, on-site evaluations available in the medical community.  

Each Professional Imaging clinic is staffed with a licensed physician, a certified speech-language pathologist, and a driver technician who aids in the transportation of patients to and from facilities. The physician compiles a complete medical history of each patient by performing an extensive chart review and interviewing family, staff, or the patient as appropriate. Once a patient's case history is complete, the physician performs a focused, expanded physical exam of the patient along with the radiographic MBSS (modified barium swallow study) in conjunction with the speech-language pathologist. Upon completion of the consultation, recommendations are made to the primary care physician, the facility SLP, and the facility nursing staff.

Not only do we provide efficient delivery of high-quality services, we are also dedicated to treating each and every patient with dignity and respect. We continually strive to be the leading provider of specialized dysphagia evaluations.

 

2022 EHR RWT Results

 

Professional Imaging PI EMR v2.0

2022 Real World Testing Results

CHPL Product Number: 15.07.07.2835.PI01.01.01.1.221209 (current), 15.07.07.2835.PI01.01.00.1.190712 (previous)

Developer RWT Plan/Results Page URL: https://www.proimagetx.com/2022-ehr-real-world-testing-plan

Care Setting: Ambulatory Interaction with Nursing Facilities

PI EMR is a Self-Developed EHR

 

Summary of Testing Methods and Key Findings

 

Care Coordination

The following outlines the testing methods and findings for the certification criteria concerning Care Coordination (170.315(b)(1) and 170.315(b)(2)).

 

For (b)(1) Transitions of Care, a change was made to the original test method specified in the Testing Plan. Instead of providing a representative sample for a synthetic patient based on actual patient records, we generated a Continuity of Care CCDA document and a Referral Note CCDA document using one of our synthetic patients. This is because our practices do not make referrals and therefore do not generate CCDA referral notes. The impact of this change is negligible, as the revised test demonstrates the ability of PI EMR to meet this criterion as fully as the originally specified method would have. The generated CCDA XML files were successfully validated using the ONC C-CDA R2.1 Validator Tool at https://ett.healthit.gov/ett/#/validators/ccdar2.
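
The ONC validator referenced above is a web tool. As a purely illustrative Python sketch (not part of PI EMR; file names are hypothetical), the following shows the kind of local pre-check that can catch obvious problems in a generated CCDA XML file before it is uploaded: it confirms that the file parses and has the expected ClinicalDocument root in the HL7 CDA namespace.

# Minimal local pre-check for a generated CCDA XML file, run before uploading
# to the ONC C-CDA R2.1 Validator Tool. Illustrative sketch only.
import sys
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"  # HL7 CDA namespace used by C-CDA documents

def precheck_ccda(path: str) -> bool:
    """Return True if the file parses and has a ClinicalDocument root."""
    try:
        root = ET.parse(path).getroot()
    except ET.ParseError as err:
        print(f"{path}: not well-formed XML ({err})")
        return False
    if root.tag != f"{{{CDA_NS}}}ClinicalDocument":
        print(f"{path}: unexpected root element {root.tag}")
        return False
    print(f"{path}: well-formed ClinicalDocument")
    return True

if __name__ == "__main__":
    # e.g. python precheck_ccda.py ccd_synthetic_patient.xml referral_note.xml
    ok = all(precheck_ccda(p) for p in sys.argv[1:])
    sys.exit(0 if ok else 1)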

 

For (b)(2) Clinical Information Reconciliation and Incorporation, we used a representative synthetic patient. The initial patient record was imported first. Then an updated patient record containing new medications, problems, and allergies was imported and reconciled with the medications, problems, and allergies in the initial record. Our practices have never received patient records in CCDA format from nursing facilities, so to simulate a referral from another practice we generated the initial and updated patient records (CCDA files) at a practice at one location and transmitted the CCDA files to a second practice at a different location using a secure messaging system. Import, reconciliation, and incorporation of the initial and updated CCDA files were then performed at the second practice. The data was successfully imported and applied to the target patient with no errors.
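
As a hedged illustration of the reconciliation step described above (and not the actual PI EMR implementation), the Python sketch below merges coded entries from an initial and an updated record and surfaces the items that are new for review; the Entry structure and the example codes are illustrative assumptions.

# Illustrative sketch: entries from an initial record and an updated record are
# merged by code so that new items can be reviewed before incorporation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Entry:
    code: str          # e.g. an RxNorm, SNOMED CT, or allergy code (illustrative)
    description: str

def reconcile(initial: list[Entry], updated: list[Entry]) -> dict[str, list[Entry]]:
    """Split the updated list into items already present and items that are new."""
    known = {e.code for e in initial}
    new_items = [e for e in updated if e.code not in known]
    merged = initial + new_items          # union of both records, keyed by code
    return {"merged": merged, "new": new_items}

# Example: a new medication appears only in the updated record.
initial_meds = [Entry("197361", "Amlodipine 5 MG Oral Tablet")]
updated_meds = initial_meds + [Entry("310965", "Ibuprofen 200 MG Oral Tablet")]
result = reconcile(initial_meds, updated_meds)
print([e.description for e in result["new"]])   # -> ['Ibuprofen 200 MG Oral Tablet']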

 

Clinical Quality Measures

The following outlines the testing methods and findings for the certification criteria concerning Clinical Quality Measures (170.315(c)(1), 170.315(c)(2), and 170.315(c)(3)).

 

Changes were made to our testing plan for Clinical Quality Measures. Instead of analyzing the data logs of our CQM submissions for MIPS, synthetic patients were created by extracting redacted patient data for the reporting year, and QRDA I and QRDA III files were generated and tested using the Cypress test tool and a CVU+ test product. This change was made because the planned testing depended on our electronic reporting of 2021 CQM data to CMS for MIPS, and for this year our practices were not required to report for MIPS. The impact of this change is negligible, as the updated test demonstrates the ability of PI EMR to meet these criteria as fully as the originally specified method would have.

 

More than 200 synthetic patients were created by extracting patient data for the reporting year from the PI patient database. Names, addresses, and phone numbers were redacted to ensure that patient confidentiality was maintained.
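
As a hedged illustration of this redaction step (the field names and helper are hypothetical, not the actual PI EMR schema), the Python sketch below replaces direct identifiers with synthetic placeholders, using the RWT_PT_PATIENT naming pattern that appears later in this report.

# Illustrative sketch: direct identifiers are replaced with synthetic
# placeholders before export. Field names are hypothetical.
import copy

REDACTED_FIELDS = ("first_name", "last_name", "address", "phone")

def redact_patient(record: dict, index: int) -> dict:
    """Return a copy of a patient record with identifiers replaced."""
    synthetic = copy.deepcopy(record)
    for field in REDACTED_FIELDS:
        synthetic[field] = "REDACTED"
    # Give each synthetic patient a recognizable test name instead.
    synthetic["first_name"] = "RWT_PT"
    synthetic["last_name"] = f"PATIENT_{index:03d}"
    return synthetic

source = {"first_name": "Jane", "last_name": "Doe",
          "address": "1 Main St", "phone": "555-0100",
          "dob": "1948-03-12", "problems": ["R13.10"]}   # dysphagia, unspecified
print(redact_patient(source, 6)["last_name"])            # -> PATIENT_006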

 

A subset of these patients was exported for each measure in QRDA I format. The number of patients with the same numerator, denominator, and denominator exclusion/exception values was limited so that adequate testing could be performed with no more than 40 patients per measure.
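
The subset selection can be pictured as grouping patients by their outcome pattern and capping the totals. The Python sketch below is an assumed illustration of that logic only; the per-pattern cap of 5 and the data shapes are assumptions, not figures from the plan.

# Illustrative sketch: group patients by (numerator, denominator,
# exclusion/exception) outcome, cap duplicate patterns, and keep the
# per-measure export at or under 40 patients.
from collections import defaultdict
from itertools import islice

MAX_PER_MEASURE = 40
MAX_PER_PATTERN = 5   # assumed cap on identical outcome patterns

def select_subset(patients):
    """patients: iterable of dicts with 'id', 'num', 'den', 'excl' keys (assumed)."""
    by_pattern = defaultdict(list)
    for p in patients:
        by_pattern[(p["num"], p["den"], p["excl"])].append(p)
    selected = []
    for group in by_pattern.values():
        selected.extend(islice(group, MAX_PER_PATTERN))
    return selected[:MAX_PER_MEASURE]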

 

The synthetic patient QRDA I files were uploaded as vendor patients, and a CVU+ test product was created for each measure being tested. The process of uploading the vendor patients constitutes the (c)(1) record and export test: successfully uploading the QRDA I files using the Cypress vendor patient import function demonstrated that the PI EMR-generated RWT QRDA I files were recorded and exported with no errors.

 

Once vendor patients were uploaded and the CVU+ product was created, the test deck for the CVU+ product was downloaded and imported into the PI EMR "Tools / Meaningful Use Commands / Quality/MU Measure Data..." function. Measure calculations were then performed and a QRDA III summary file was created for the measure.

 

Finally, the PI EMR-generated QRDA III file was uploaded to the Cypress test tool EP/EC Measures Test page for validation. Successful validation of the QRDA III files indicated that the (c)(2) and (c)(3) functionality (import, calculate, and submit) performed correctly with no errors.

 

Patient Engagement

The following outlines the testing methods and results for the certification criteria concerning Patient Engagement (170.315(e)(1)).

 

We analyzed use logs for our patient portals to provide (e)(1) RWT results for viewing and downloading of patient records. Instead of using new accounts for a three-month period, we included all new accounts generated in 2022 up to 5/20/2022. Regarding transmission of patient records, we provide printed copies to the nursing facilities, which eliminates the need for electronic transmission. We therefore used a representative sample of synthetic patient records to show that PI EMR does have compliant functionality for transmitting patient records.

 

As of 5/20/2022, a total of 9,704 documents had been viewed from the patient portal, with no failures recorded. Over the same period, 2,336 documents were downloaded from the portal, with no failures detected.
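
As a hedged illustration of how such use-log counts can be produced (the CSV log format with timestamp, account, action, and status columns is hypothetical, not the actual portal log), the Python sketch below tallies view and download events and any failures.

# Illustrative sketch: tally portal view/download events and failures per
# event type from an assumed CSV use log.
import csv
from collections import Counter

def summarize_portal_log(path: str) -> dict:
    totals, failures = Counter(), Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            action = row["action"]            # 'view' or 'download'
            totals[action] += 1
            if row["status"] != "success":
                failures[action] += 1
    return {"totals": dict(totals), "failures": dict(failures)}

# Example: summarize_portal_log("portal_use_2022.csv")
# -> {'totals': {'view': 9704, 'download': 2336}, 'failures': {}}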

 

For the transmit test, CCDA test documents for five representative synthetic patients were validated using the ONC C-CDA R2.1 Validator Tool at https://ett.healthit.gov/ett/#/validators/ccdar2 and showed no failures.

 

Public Health

The following outlines the test methods and results for the certification criteria concerning Transmission to Public Health Agencies (170.315(f)(2) and 170.315(f)(7)).

 

For (f)(2), five NIST HL7v2 Syndromic Surveillance XML files were generated by PI EMR as a representative sample of synthetic patient data based on our actual patient cases. To verify the completeness and accuracy of the test files, the NIST HL7v2 Syndromic Surveillance validation tool at https://hl7v2-ss-r2-testing.nist.gov/ss-r2/#/home was used and showed no errors.

 

For (f)(7), five NHCS XML survey files were generated by PI EMR as a representative sample of synthetic patient data based on our actual patient cases. To verify the completeness and accuracy of these test files, the NHCS validation tool at https://cda-validation.nist.gov/cda-validation/muNHCS12.html was used and showed no errors.

 

Application Programming Interfaces

The following outlines the testing methods and results for the certification criteria concerning APIs (170.315(g)(7), 170.315(g)(8) and 170.315(g)(9)).

 

Criteria (g)(7), (g)(8), and (g)(9) were tested using the in-house application PI_EHR_API_Test.exe, which demonstrates use of our certified EHR API. This approach was taken because, since its implementation, we have received no requests for information through our API.

 

For (g)(7), an encrypted API key was successfully generated that allows access to a specific representative synthetic patient.
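
The report does not describe how the encrypted key is constructed. As a purely illustrative sketch of one common way to scope a key to a single patient, the Python code below issues and verifies an HMAC-signed token binding a patient ID and an expiry; this is an assumption for illustration, not the actual PI EMR key scheme.

# Illustrative sketch only: a signed token that grants access to one patient.
# NOT the actual PI EMR key mechanism, which is not described in this report.
import base64, hashlib, hmac, json, time

SERVER_SECRET = b"replace-with-a-real-secret"   # hypothetical server-side secret

def issue_patient_key(patient_id: str, ttl_seconds: int = 3600) -> str:
    claims = {"patient_id": patient_id, "exp": int(time.time()) + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def verify_patient_key(token: str):
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                              # signature mismatch: reject
    claims = json.loads(base64.urlsafe_b64decode(payload.encode()))
    return claims if claims["exp"] > time.time() else None

# Example: verify_patient_key(issue_patient_key("RWT_PT_PATIENT_06"))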

 

Testing of (g)(8) consisted of downloading all required data item categories for the aforementioned synthetic patient and comparing the SHA-2 256-bit hash digest sent with each data item to a SHA-2 256-bit hash digest computed for the received item. All received data items were successfully validated; every computed digest matched the digest that was sent.
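
As a hedged illustration of this integrity check (the function and field names are hypothetical), the Python sketch below computes a SHA-256 digest over the received bytes and compares it, in constant time, to the digest sent with the item.

# Illustrative sketch: compare the digest sent with a data item to a digest
# computed locally over the bytes actually received.
import hashlib
import hmac

def verify_item(received_bytes: bytes, sent_digest_hex: str) -> bool:
    """Return True if the received payload matches the SHA-256 digest sent with it."""
    computed = hashlib.sha256(received_bytes).hexdigest()
    return hmac.compare_digest(computed, sent_digest_hex.lower())

# Example with a hypothetical downloaded item:
item = {"category": "medications", "payload": b"<xml>...</xml>",
        "sha256": hashlib.sha256(b"<xml>...</xml>").hexdigest()}
assert verify_item(item["payload"], item["sha256"])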

 

Testing of (g)(9) was performed by downloading a CCDA document for the aforementioned synthetic patient. This file was validated using the ONC C-CDA R2.1 Validator Tool at https://ett.healthit.gov/ett/#/validators/ccdar2 and no errors were found.

 

Electronic Exchange

The following outlines the testing methods and results for the certification criteria concerning Electronic Exchange (170.315(h)(1)).

 

A synthetic test patient was looked up in PI EMR, then exported and sent in both human-readable and CCDA R2 format (as a zip file) as an attachment to a test user on our HISP provider's (MaxMD) portal. The attachment received by the test user on the HISP portal was a copy of the zip file sent from PI EMR.

 

The received zip file was then downloaded by the test user and renamed from RWT_PT_PATIENT_06_Pt_Smy(2022-06-30).zip to RWT_PT_PATIENT_06_Pt_Smy(2022-06-30) - Reply.zip.

 

The test user then replied to the original Direct message and sent the renamed zip file as an attachment. The PI EMR Messaging function was then used to receive the reply and download the attachment.

 

The contents of the original zip file sent by PI EMR were electronically compared to the contents of the downloaded attachment and were confirmed to contain exactly the same data. The CCDA R2 XML file in the zip file attachment was successfully validated using the ONC C-CDA R2.1 Validator Tool at https://ett.healthit.gov/ett/#/validators/ccdar2.
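
As a hedged illustration of such an electronic comparison (the helper is hypothetical; the file names are those used in the test), the Python sketch below maps each zip member to a SHA-256 digest of its contents and checks that the sent and received archives match.

# Illustrative sketch: compare member names and content digests between the
# sent and received zip files.
import hashlib
import zipfile

def zip_fingerprint(path: str) -> dict:
    """Map each zip member name to a SHA-256 digest of its contents."""
    with zipfile.ZipFile(path) as zf:
        return {name: hashlib.sha256(zf.read(name)).hexdigest()
                for name in zf.namelist()}

sent = zip_fingerprint("RWT_PT_PATIENT_06_Pt_Smy(2022-06-30).zip")
received = zip_fingerprint("RWT_PT_PATIENT_06_Pt_Smy(2022-06-30) - Reply.zip")
assert sent == received, "sent and received zip contents differ"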

 

Key Milestones

Communication with our affiliate practice to schedule participation was conducted as necessary by phone throughout the first three quarters of 2022.

 

Testing was performed on the following dates:

· (b)(1) and (b)(2) – July 2 to 3, 2022

· (c)(1), (c)(2) and (c)(3) – September 13, 2022

· (e)(1) – June 27 to 30, 2022

· (f)(2) and (f)(7) – June 25 to 30, 2022

· (g)(7), (g)(8) and (g)(9) – June 27 to July 3, 2022

· (h)(1) – July 21, 2022

 

Our CY 2023 real-world test plan was completed and submitted to our ONC-ACB on October 16, 2022.

 

All test documents associated with this report are available upon request.

 

Overall Summary

The testing methods employed, and the necessary changes from those originally planned, are specified in the report sections for each category of functionality tested. The reported results effectively verify that the certified functionality tested complies with the criteria requirements, as no errors were encountered.

 

Standards Updates

There were no updates to the certified criteria during the period in which testing was conducted.

Authorized Representative Signature:

 

Date: January 4th, 2023