Cloud Computing-Assisted Dose Verification System and Method for Tumor Pain Treatment

At present, modern medical data processing systems based on cloud computing have become a very important part of auxiliary tumor pain treatment programs and play a key role in the soft services of the entire treatment program. This article addresses the radiation dose verification problem for tumor patients undergoing tumor treatment, using a modern medical data processing system built with cloud computing technology to solve it. The method used in this article is to select 100 tumor patients and use a cloud computing-based oncology medical data processing system to analyze irradiation parameters for patients of different ages and body types, including common irradiation sites and irradiation angles; the Monte Carlo method is then used to calculate the doses received by the patient's organs and tissues during interventional treatment, and the combined results are stored in a large database. The experimental results show that the construction and application of a tumor medical data processing system based on cloud computing can better combine the individual condition and disease characteristics of tumor patients, give an accurate radiation dose for tumor pain treatment, and reduce the incidence of risk events. Compared with the experimental and simulation data of related literature, the results show that the cloud computing-based tumor medical data processing system proposed in this paper improves the standardization of tumor pain treatment by nearly 15% on average and increases treatment plan satisfaction by nearly 20%.


I. INTRODUCTION
The medical industry is a special industry closely related to people's livelihood and people's lives. With the introduction of information technology, the informatization and automation of the medical industry have continuously improved. However, no matter how informatized it becomes, the core of the medical industry is medical data. According to statistics, by 2020 medical data will have grown sharply to 35 ZB (1 ZB = 2^30 TB), equivalent to 44 times the amount of data in 2009. Massive medical data and complicated data types will bring huge storage and processing pressure to the entire medical industry. At the same time, people are paying more and more attention to medical data. How to effectively store and process large amounts of medical data and provide efficient data services and data support for doctors and patients has become an urgent problem. In recent years, the rapid development of cloud computing has provided an important way to solve the above problems [1]-[3]. The high-performance services provided by the cloud platform offer new methods for data storage, retrieval, processing, and analysis, which can further improve the performance and efficiency of data processing. Effectively storing and processing massive medical data through cloud computing platforms, and analyzing tumor patients of different ages and body types, common irradiation sites and irradiation angles, and other irradiation parameters, has become a very meaningful research topic. Today, with the continuous development of IT technologies such as cloud computing, traditional information technology is no longer suitable for existing medical applications.

(The associate editor coordinating the review of this manuscript and approving it for publication was Wei Wei. VOLUME 8, 2020. This work is licensed under a Creative Commons Attribution 4.0 License; see https://creativecommons.org/licenses/by/4.0/.)
The medical industry must go further and establish a medical environment that specifically meets its security and availability requirements: the ''medical cloud''. The medical cloud is a system that adopts modern computing technology in the field of medical care and uses the concept of ''cloud computing'' to build healthcare services [4], [5]. This healthcare service system can effectively improve the quality of healthcare, control costs, and provide convenient access to healthcare services.
In China, the use of cloud computing technology in the medical industry has been widely promoted. Zhang Xiaojun et al. proposed a wireless medical sensor network (WMSN) to improve medical quality [6]. Yuan Y et al. proposed a cloud-based service system architecture for sharing large medical DICOM image data sets, in order to manage and share large and challenging DICOM images [7]. Xu Bo et al. believe that clinical decision support systems can effectively break the limitations of doctors' knowledge, reduce the possibility of misdiagnosis, and thus improve the level of medical care [8]. In view of the continuous development of the current medical cloud platform, Shi et al. proposed an integrated analytic hierarchy model based on an RBF network [9]. Sood SK et al. proposed a new system based on IoT sensors, cloud computing, and fog computing to distinguish, classify, and monitor users infected with MBD [10].
Based on the above background, this paper addresses the problem of radiation dose verification for tumor patients in tumor treatment, using a modern medical data processing system built with cloud computing technology. The method used in this paper is to select 100 tumor patients and use a cloud-based oncology medical data processing system to analyze irradiation parameters for patients of different ages and body types, including common irradiation sites and irradiation angles. The experimental results show that the construction and application of an oncology medical data processing system based on cloud computing can better integrate the individual conditions and disease characteristics of cancer patients, provide an accurate radiation dose for tumor pain treatment, and reduce the incidence of risk events.

II. PROPOSED METHOD
A. CLOUD COMPUTING
1) CLOUD COMPUTING AND ITS AUXILIARY ROLE IN MODERN MEDICINE
Cloud computing is an emerging method of sharing infrastructure [3], [11], [12]. It uniformly manages a large number of physical resources and virtualizes them to form a huge virtualized resource pool. The cloud is a type of parallel and distributed system consisting of a series of interconnected virtual computers. These virtual computers are dynamically deployed based on service level agreements (negotiated between suppliers and consumers) and exist as one or more unified computing resources. Cloud computing can dynamically deploy virtual resources according to users' requirements for resources and computing power, without being limited by physical resources. All of a user's cloud-based computing and applications run on virtualized resources; users need not care which physical resources these are deployed on, and can easily change their demand for computing resources.
The core technologies involved in modern medical treatment are sensing, communications, information, and security technologies. The main direction of this research is the application of cloud computing among the information technologies covered by modern medical treatment. In terms of modern medical data storage, cloud computing-based storage technology provides a reliable guarantee for storing massive medical data such as patient vital-sign data, medical images, and medical videos. Modern medical cloud platform management technology based on cloud computing includes data management, resource management and deployment, and data calculation. Traditional data management computing platforms do not make full use of resources, resulting in enormous waste. Virtual resource allocation and management based on virtualization technology therefore enables efficient use of resources, greatly reducing the pressure of physical resource use and data access. With regard to the operation support management platform of modern medical care, a patient (customer) information model is established through cloud computing technology to build a portrait of the patient and understand the patient better. At the same time, cloud computing provides technical support for medical monitoring, marketing, accounting, and billing functions across multiple medical services. Order relationship management, terminal message interconnection, and routing and forwarding functions are also implemented in the cloud computing environment.
Modern medical systems based on cloud computing provide autonomous computing technology: when the system fails, it can autonomously detect the fault, locate its source, isolate it, and finally repair it, greatly improving system reliability [13], [14]. This highly intelligent and highly automatic computing capability ensures that users can always access the services they need, and realizes the system's self-detection, self-location, and self-healing functions.

2) FEATURES OF CLOUD COMPUTING
Cloud computing includes six characteristics, as shown in Fig. 2.
1) Virtualization technology: Virtualization is the most emphasized feature of cloud computing, covering both resource virtualization and application virtualization. The environment in which each application is deployed is independent of the physical platform; application expansion, migration, and backup are managed through the virtual platform, and operations are completed through the virtualization layer.
2) Dynamic scalability: The level of virtualization can be expanded dynamically; servers can be added to existing server farms in real time to increase the ''cloud'' computing power.
3) On-demand deployment: Users running different applications require different resources and computing power, and cloud computing platforms can deploy resources and computing capabilities according to user needs.
4) High flexibility: Most current software and hardware support virtualization to some degree. Various IT resources, such as software, hardware, operating systems, and storage networks, are placed in the cloud computing virtual resource pool through virtualization and managed uniformly. At the same time, the cloud is compatible with products from different hardware manufacturers, and with low-end machines and peripherals, while still delivering high-performance computing.
5) High reliability: Virtualization technology distributes users' applications and computations across different physical servers. Even if a single server crashes, new servers can be deployed as resources and computing capacity through dynamic expansion, ensuring the normal operation of applications and computations.
6) High cost performance: Cloud computing manages all resources through a virtual resource pool and requires fewer physical resources; cheap PCs can form a cloud whose computing performance exceeds that of a mainframe.

3) CLASSIFICATION OF CLOUD COMPUTING
a: BY SERVICE MODEL
A. Infrastructure as a Service (IaaS). The service provided to consumers is the use of the computing facilities themselves; users can deploy and run arbitrary software, including operating systems and applications. Examples include the cloud computing center of Wuxi Software Park, Amazon EC2, and IBM Blue Cloud.
B. Platform as a Service (PaaS). The service provided to consumers is the deployment of customer applications onto the vendor's cloud computing infrastructure. Examples include Salesforce's Force.com, Google App Engine, and Microsoft's Windows Azure Platform.
C. Software as a Service (SaaS). The services provided to customers are the operator's applications running on cloud computing infrastructure, which users can access through client interfaces on various devices. Consumers do not need to manage or control any cloud computing infrastructure. Examples include IBM's LotusLive and AdventNet's Zoho online office suite.

b: ACCORDING TO THE AFFILIATION OF CLOUD COMPUTING PROVIDERS AND USERS
A. Public Cloud. A public cloud is a cloud environment shared by several enterprises and users, who obtain the IT resource services they need on a pay-as-you-go basis. Typical public clouds at present include Google App Engine, Amazon EC2, IBM Developer Cloud, and the Wuxi Cloud Computing Center.
B. Private Cloud. A private cloud is a cloud environment independently built and used by an enterprise. In a private cloud, internal members share all the resources provided by the cloud computing environment, and users outside the company or organization cannot access its services. China's ''Sinhua Cloud Computing'' is a typical private cloud instance.
C. Hybrid Cloud. A hybrid cloud is a mix of public and private clouds. The iTricity Cloud Computing Center in the Netherlands is a hybrid cloud with five data centers developed on the basis of IBM's Blue Cloud technology, providing cloud services to multiple countries.

B. MODERN MEDICAL SYSTEM BASED ON CLOUD COMPUTING
1) MODERN MEDICAL ARCHITECTURE BASED ON CLOUD COMPUTING
With the vigorous development of the ubiquitous network, this ubiquitous wireless network environment provides infrastructure support for the realization and popularization of modern medical treatment. Modern medicine uses a wireless network composed of physical data collection nodes on the human body and biosensor nodes inside the human body. It has great application significance in telemedicine and home health systems and is becoming a strategic research direction in the medical and communications industries [14], [15]. In the ubiquitous wireless environment, modern medical systems based on cloud computing can collect people's physical signs at any time and any place, and use cloud storage and parallel computation of massive sign data to monitor people's health in real time, providing timely and accurate medical services. The modern medical architecture based on cloud computing is divided into four layers: the sensor node layer, the personal node gateway layer, the log node data layer, and the cloud node cloud computing layer. The architecture diagram is shown in Fig. 4.
The foundation of a modern medical framework based on cloud computing is an implementation based on IEEE 11073, because research on sensors and protocols based on IEEE 11073 is the cornerstone of the entire architecture and of the sensor node layer. The core of this layer is to collect heterogeneous data from monitoring terminals of different brands and types through sensor technology and to unify the data format using the IEEE 11073 protocol, which greatly improves the efficiency of data post-processing and reduces data processing costs. The personal node gateway layer and the log node data layer of the modern medical architecture are closely related to the person: these two layers generate the sign data and define its format. They shape the design of the cloud computing layer below and are an important guarantee of the data quality at the computing layer. The computing layer of this research is a modern medical data processing system that adopts a cloud architecture, covering data collection, data cleaning, data storage, and real-time data processing. It faces the challenge of massive sign data, and big data storage and analysis in all aspects will be the key to this research. Finally, the data access layer based on a distributed cache framework is also a difficulty and focus of this research.

2) MODERN MEDICAL DATA PROCESSING SYSTEM BASED ON CLOUD COMPUTING
Hadoop is a widely used distributed system and the basic skeleton of the existing cloud computing ecosystem [16]-[18]. A Hadoop cluster offers high-speed parallel computing performance and terabyte-level data storage capacity. Its basic components are HDFS (a distributed file system) as the distributed storage framework, with the MapReduce distributed computing engine as its upper layer. HBase [19], [20] is a column-oriented, highly scalable, distributed key-value storage system that, unlike traditional databases, is characterized by stability, reliability, and outstanding performance. Hive plays the role of the data warehouse in the Hadoop ecosystem: it can store massive historical data and provides complete SQL-like statements for user analysis and queries. Storm, a distributed, fault-tolerant real-time computing framework also known as real-time Hadoop, is an efficient real-time processing system. Storm's high reliability lies in the fact that every message submitted to the system will be processed, and millions of messages can be processed per second, which fully reflects the high throughput of the system [21]. More conveniently, the system supports development in any programming language, putting development work within easy reach. The distributed messaging system Kafka is a high-throughput distributed publish-subscribe messaging system. It can stably store messages for a long time without any loss, and even on very cheap hardware it can sustain a throughput of 100,000 messages per second. Kafka supports partitioning messages between publishers and consumers, and can load massive amounts of data stored in Hadoop into the system efficiently and in parallel through corresponding APIs.
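To make the MapReduce model concrete, the following is a minimal single-process sketch of its map, shuffle, and reduce phases (plain Python, not the Hadoop API; the sample records are made up):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Map: emit a (word, 1) pair for every word in one input record.
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    # Shuffle: group intermediate values by key, as the framework
    # does between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine the grouped values for each key.
    return {key: sum(values) for key, values in groups.items()}

records = ["dose verification dose", "dose plan"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(shuffle(pairs))
```

In Hadoop the same three phases run in parallel across the cluster, with HDFS holding the input splits and the shuffled intermediate data.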
Zookeeper, the distributed cluster coordination framework, is used for distributed coordination in Hadoop. Many frameworks in the Hadoop ecosystem use Zookeeper, which can efficiently coordinate and manage large-scale clusters. Sqoop, the data synchronization tool of the Hadoop ecosystem, provides large-scale, high-performance data synchronization. Combined with a traditional relational database, the Redis cache database is used to store multi-structured data, with an appropriate storage format strategy chosen for each data structure. Moreover, the distributed cache framework provides efficient and highly reliable storage access for large amounts of heterogeneous data. The Hadoop ecosystem lays a solid foundation for the entire framework of the system. The modern medical cloud application architecture is shown in Fig. 5. Physiological parameter acquisition part: due to the requirements on data storage efficiency, the traditional B+ tree disk storage method with an access cost of O(lg n) is not suitable for this system, which seeks a more efficient storage strategy with O(1) disk access cost. Therefore, this research proposes a data collection system for the modern medical architecture: a publish-subscribe distributed information collection system that supports both synchronous acquisition and asynchronous subscription and, through client connection reuse, is closer to JMS and Notify.
Data storage and analysis: in terms of storage, the highly fault-tolerant, high-transmission-rate HDFS distributed file system is planned as the base, with the SQL-like data warehouse Hive providing high-capacity, high-stability storage on top of it. In terms of data analysis, Spark, an iterative MapReduce-style data analysis system, is planned to use memory as the intermediate data store, reducing I/O, increasing intermediate data write and read speeds, lowering latency, and speeding up the data mining process; the machine learning tool Mahout is used to improve the quality of the mined data. At the same time, the distributed fault-tolerant real-time computing system Storm is employed, and the Hive and Impala query planners are optimized. For cluster performance monitoring and MapReduce task management, Chukwa is used for Hadoop log collection and analysis, and a graphical interface displays cluster disk I/O, memory usage, and cluster storage usage trends, enhancing the system's performance monitoring capabilities. The workflow scheduling framework Oozie is used to schedule Hadoop MapReduce tasks for execution. Oozie workflows are sets of tasks arranged in DAGs (directed acyclic graphs), and Oozie automatically schedules MapReduce tasks based on the specified job dependencies.
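The DAG-based scheduling that Oozie performs can be sketched as a topological sort over job dependencies (a simplified stand-in, not the Oozie API; the workflow names are hypothetical):

```python
from collections import deque

def schedule(jobs):
    # jobs: mapping of job name -> set of jobs it depends on (a DAG).
    # Returns a run order in which every job follows all of its
    # dependencies, as a DAG workflow scheduler must guarantee.
    indegree = {j: len(deps) for j, deps in jobs.items()}
    dependents = {j: [] for j in jobs}
    for j, deps in jobs.items():
        for d in deps:
            dependents[d].append(j)
    ready = deque(j for j, n in indegree.items() if n == 0)
    order = []
    while ready:
        j = ready.popleft()
        order.append(j)
        for nxt in dependents[j]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return order

workflow = {"collect": set(), "clean": {"collect"},
            "store": {"clean"}, "analyze": {"clean"}}
order = schedule(workflow)
```

Oozie adds time- and data-availability triggers on top of this dependency ordering, but the core guarantee is the same.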
Data display layer part: in the modern medical application layer architecture, a distributed cache framework provides powerful storage and query capabilities for data access in the data display layer. This distributed cache framework is characterized by high redundancy, high reliability, and mass storage; it can run normally and stably under millions of parallel accesses and return the results of user queries accurately and quickly. For better system monitoring, a simple management interface is also a feature of this framework.

III. EXPERIMENTS
A. EXPERIMENTAL DATA SET
The experimental data was used to verify the accuracy of the cloud computing-based tumor medical data processing system. 100 glioma patients were selected and divided into a control group and an observation group by the random number table method, with 50 patients in each group. In the control group, there were 30 males and 20 females, aged 35 to 66 years, with an average age of (47.23 ± 2.55) years; 24 cases had tumors on the right frontal lobe, 16 on the right temporal lobe, 5 on the right parietal occipital lobe, and 4 on the left frontal lobe. Tumor diameter was between 2 and 6 cm, with an average of (4.23 ± 0.32) cm; the course of disease was 10 to 283 d, with an average of (167.31 ± 23.89) d. The traditional surgical safety inspection method was used in this group. In the observation group, there were 30 males and 20 females, aged 37 to 67 years, with an average age of (48.45 ± 2.83) years; 22 cases had tumors on the right frontal lobe, 19 on the right temporal lobe, 4 on the right parietal occipital lobe, and 4 on the left frontal lobe. Tumor diameter was between 2 and 7 cm, with an average of (4.18 ± 0.28) cm; the course of disease was 12 to 276 days, with an average of (160.29 ± 21.76) days. The cloud-based tumor medical data processing system was used to perform the surgical safety check in this group.

B. EXPERIMENTAL PROCESS
When the exposure dose of the internal organs of the human body cannot be measured experimentally, computer simulation can be used for the calculation. The Monte Carlo method has high accuracy for particle transport problems, so this study uses the Monte Carlo program MCNPX to simulate the interventional radiation process. MCNPX is a large Monte Carlo program developed by Los Alamos National Laboratory in the United States; it can calculate the transport of particles such as neutrons, photons, and electrons in complex geometric structures. In addition, in order to simulate the patient's exposure, a variety of computational human phantoms were used in this study. These models include the organs important in radiation protection, so the Monte Carlo program can be used to calculate the radiation dose of each organ.
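To illustrate the sampling idea behind Monte Carlo particle transport (a deliberately simplified 1-D sketch, nothing like the full MCNPX physics; the attenuation coefficient and slab thickness are made-up values):

```python
import math
import random

def mc_absorbed_fraction(mu, thickness, n_photons=100_000, seed=1):
    # Toy 1-D Monte Carlo: photons enter a slab with linear
    # attenuation coefficient mu (1/cm). Each photon's free path is
    # sampled from an exponential distribution; photons whose path
    # ends inside the slab are counted as absorbed.
    rng = random.Random(seed)
    absorbed = sum(1 for _ in range(n_photons)
                   if -math.log(rng.random()) / mu < thickness)
    return absorbed / n_photons

frac = mc_absorbed_fraction(mu=0.2, thickness=5.0)
# Analytic attenuation result for comparison: 1 - exp(-mu * x)
analytic = 1 - math.exp(-0.2 * 5.0)
```

With enough histories the sampled fraction converges to the analytic value, which is the same convergence property MCNPX relies on for organ-dose tallies in far more complex geometries.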
Monte Carlo method and cloud computing-based tumor medical data processing system-assisted simulation of the tumor treatment process: for the interventional treatment process, six commonly used irradiation directions were selected: posteroanterior (PA), left anterior oblique (LAO), right anterior oblique (RAO), cranial (CRAN), left lateral (LLAT), and right lateral (RLAT), as shown in Fig. 6. This article simulates the various parameters of the X-ray machine according to different types of patients and different irradiation sites. The tube voltage is set to 55, 60, 70, 80, 90, 100, 110, and 120 kVp; the thickness of the copper filter is 0, 0.1, 0.2, and 0.3 mm, and the thickness of the aluminum filter is fixed at 3.5 mm. In addition, depending on the irradiation field of view, the ray field on the receiving screen will have different sizes; here, square irradiation fields with side lengths of 10, 20, 30, and 40 cm were simulated. The X-ray dose rate used in the software was measured with a spherical air ionization chamber with a sensitive volume of 0.6 cm³, placed at the center of the irradiation field 4 cm above the X-ray tube. In the Monte Carlo simulation, a spherical detector with the same 0.6 cm³ sensitive volume is modeled at the same position above the X-ray tube. Since the dose calculated by MCNPX is normalized to one photon, a corresponding conversion is required to obtain the actual organ dose. This conversion relationship is shown in formula (1):
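The equation referenced as formula (1) is missing from this copy of the text. A plausible reconstruction, based on the standard per-source-photon normalization of Monte Carlo tallies (the symbols below are assumptions, not necessarily the authors' notation):

```latex
D_{\mathrm{organ}} = D^{\mathrm{MC}}_{\mathrm{organ}} \cdot N_{\gamma},
\qquad
N_{\gamma} = \frac{\dot{D}_{\mathrm{meas}}\, t}{D^{\mathrm{MC}}_{\mathrm{det}}}
\tag{1}
```

where $D^{\mathrm{MC}}_{\mathrm{organ}}$ is the per-photon organ dose tallied by MCNPX, $D^{\mathrm{MC}}_{\mathrm{det}}$ the per-photon dose in the simulated detector, $\dot{D}_{\mathrm{meas}}$ the dose rate measured by the ionization chamber at the same position, and $t$ the exposure time, so that $N_{\gamma}$ is the effective number of source photons.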

IV. DISCUSSION
A. COMPARISON OF OPERATING BEHAVIOR STANDARDS
The handover time of patients in the observation group was (7.68 ± 1.65) min and that of the control group was (13.45 ± 2.89) min; the handover time of the observation group was significantly shorter (P < 0.05). The observation group's compliance rate for surgical safety inspection, qualification rate of surgical document writing, and rate of standardized antibiotic use were all higher than those of the control group, and the differences between the two groups were statistically significant (χ² = 5.94, 6.11, and 8.12, respectively; P < 0.05); see Table 1. Figure 7 visually compares the operating behavior norms of medical staff under the two verification methods from three aspects. It can be seen that the cloud-based tumor medical data processing system-assisted tumor treatment program can effectively regulate the operating behavior of medical staff: compared with the traditional scheme, the degree of standardization has increased by nearly 15% on average. Table 2 shows the satisfaction surveys of medical staff under the different verification methods, and Fig. 8 compares their satisfaction from three levels. It can be seen that 90% of medical staff give a very high evaluation to the cloud-based tumor medical data processing system-assisted tumor treatment program.
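The χ² statistics quoted above come from 2×2 contingency tables of compliant versus non-compliant counts in the two groups. A minimal sketch of the computation (the counts below are hypothetical examples, not the paper's Table 1 data):

```python
def chi_square_2x2(a, b, c, d):
    # Pearson chi-square statistic (no continuity correction) for a
    # 2x2 table [[a, b], [c, d]], e.g. compliant/non-compliant counts
    # in the observation and control groups.
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 48/50 compliant in one group vs 38/50 in the other.
chi2 = chi_square_2x2(48, 2, 38, 12)
```

A value above the 3.84 critical point of the χ² distribution with one degree of freedom corresponds to P < 0.05, the significance threshold used in the paper.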
Compared with traditional programs, satisfaction with the cloud-based tumor medical data processing system-assisted tumor treatment program proposed in this paper has increased by nearly 20%. It can also be intuitively seen that the proposed scheme effectively reduces the incidence of common risk events. Fig. 10 compares the average incidence of risk events within 10 hours under the two verification methods. Although the risk event rates of both the control group and the observation group can be reduced to less than one thousandth after 10 hours, the reduction achieved by the cloud computing-based auxiliary tumor treatment method proposed in this article is clearly more pronounced and effective.

B. RISK EVENT AND SCORE COMPARISON
The operating room, as an important platform department, focuses on surgical treatment and the rescue of critically ill patients. In general, it serves as a core decision-making department that ensures the safety of patients' surgery and provides high-quality, comprehensive perioperative intervention. The quality and safety of medical care are among the most fundamental conditions for hospital operation. In the current era of big data, digital management has become the basic development direction of operating room construction and an important part of building digital hospitals. Previous research data show that, according to current national technical standards for diagnosis and treatment and the requirements of surgical safety goals, safety issues concern not only management during surgery but all aspects of the perioperative period; control and management therefore require a large amount of information and data support. The cloud computing operating room mobile information system implemented in this study is based on the special needs of glioma patients and extends the wireless network into the operating room. Through ports it connects to the operating room anesthesiologist workstation, the hospital electronic medical record system, and the supply circulation management system, achieving multi-system interactive support and information transfer between doctors and nurses, and providing a guarantee for the safety management of the operating room.

V. CONCLUSIONS
At present, digital management has become the basic development direction of medical construction and an important part of digital hospital construction. Previous research data show that, according to current national technical standards for diagnosis and treatment and the requirements of surgical safety goals, safety issues concern not only management during surgery but all aspects of the perioperative period; control and management therefore require a large amount of information and data support. The cloud computing-based tumor medical data processing system implemented in this study assists the tumor treatment process by extending the wireless network into the operating room. Through ports it connects to the operating room anesthesiologist workstation, the hospital electronic medical record system, and the consumables circulation management system, achieving multi-system interactive support and information exchange between doctors and nurses, and providing a guarantee for operating room safety management.
By adopting a cloud-based virtualization solution and migrating all medical management systems to the cloud data center in sequence, the efficiency of IT maintenance staff, the utilization of hospital infrastructure, the continuity of medical system operations, and data security are all significantly improved, while the hospital's investment and management costs are greatly reduced. This makes it easy to deal with the medical industry's problems of numerous information system categories, high business-continuity requirements, scarce system maintenance time, and short troubleshooting windows. Using the characteristics and advantages of cloud computing, and reasonably establishing a medical cloud data center according to each hospital's situation, is in line with the future development trend of medical informatization and is the only way to lead hospital informatization to a deeper level.
The results of this study indicate that the construction and application of a cloud-based tumor medical data processing system can better combine the individual condition and disease characteristics of tumor patients, give accurate radiation doses for tumor pain treatment, and reduce the incidence of risk events. Compared with the experimental and simulation data of related literature, the cloud computing-based tumor medical data processing system proposed in this paper improves the standardization of tumor pain treatment by nearly 15% on average and treatment plan satisfaction by nearly 20%.