<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="review-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Med.</journal-id>
<journal-title>Frontiers in Medicine</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Med.</abbrev-journal-title>
<issn pub-type="epub">2296-858X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fmed.2022.840498</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Medicine</subject>
<subj-group>
<subject>Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Applications of Artificial Intelligence in Myopia: Current and Future Directions</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Zhang</surname> <given-names>Chenchen</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/1606525/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Zhao</surname> <given-names>Jing</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/1695853/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Zhu</surname> <given-names>Zhe</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/1695228/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Li</surname> <given-names>Yanxia</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/1670228/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Li</surname> <given-names>Ke</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/1696427/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Wang</surname> <given-names>Yuanping</given-names></name>
<uri xlink:href="http://loop.frontiersin.org/people/1695249/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Zheng</surname> <given-names>Yajuan</given-names></name>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1695855/overview"/>
</contrib>
</contrib-group>
<aff><institution>Department of Ophthalmology, The Second Hospital of Jilin University</institution>, <addr-line>Changchun</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Tae-Im Kim, Yonsei University, South Korea</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Beatrice Gallo, Epsom and St. Helier University Hospitals NHS Trust, United Kingdom; Chih-Chien Hsu, Taipei Veterans General Hospital, Taiwan</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Yajuan Zheng  <email>zhengyajuan124&#x00040;126.com</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Ophthalmology, a section of the journal Frontiers in Medicine</p></fn></author-notes>
<pub-date pub-type="epub">
<day>11</day>
<month>03</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>9</volume>
<elocation-id>840498</elocation-id>
<history>
<date date-type="received">
<day>21</day>
<month>12</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>15</day>
<month>02</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2022 Zhang, Zhao, Zhu, Li, Li, Wang and Zheng.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Zhang, Zhao, Zhu, Li, Li, Wang and Zheng</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license> </permissions>
<abstract>
<p>With the continuous development of computer technology, big data acquisition and imaging methods, the application of artificial intelligence (AI) in medical fields is expanding. The use of machine learning and deep learning in the diagnosis and treatment of ophthalmic diseases is becoming more widespread. As one of the main causes of visual impairment, myopia has a high global prevalence. Early screening or diagnosis of myopia, combined with other effective therapeutic interventions, is very important for maintaining a patient&#x00027;s visual function and quality of life. Through training on fundus photographs, optical coherence tomography scans, and slit-lamp images, and through platforms provided by telemedicine, AI shows great application potential in the detection, diagnosis, progression prediction and treatment of myopia. In addition, AI models and wearable devices based on other forms of data also perform well in behavioral interventions for myopia patients. Admittedly, there are still some challenges in the practical application of AI in myopia, such as the standardization of datasets; the acceptance attitudes of users; and ethical, legal and regulatory issues. This paper reviews the clinical application status, potential challenges and future directions of AI in myopia and proposes that the establishment of an AI-integrated telemedicine platform will be a new direction for myopia management in the post-COVID-19 period.</p></abstract>
<kwd-group>
<kwd>artificial intelligence</kwd>
<kwd>machine learning</kwd>
<kwd>deep learning</kwd>
<kwd>telemedicine</kwd>
<kwd>myopia</kwd>
</kwd-group>
<counts>
<fig-count count="2"/>
<table-count count="1"/>
<equation-count count="0"/>
<ref-count count="95"/>
<page-count count="10"/>
<word-count count="8157"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>Introduction</title>
<p>With the continuous development of computer technology, big data acquisition and imaging methods, the application of artificial intelligence (AI) in medical fields is expanding. Recently, a large number of AI-related studies have been carried out in many disciplines, such as ophthalmology, radiology, cardiovascular medicine, and oncology (<xref ref-type="bibr" rid="B1">1</xref>&#x02013;<xref ref-type="bibr" rid="B4">4</xref>). Thanks to the development of multimodal imaging, fundus photography and optical coherence tomography (OCT) have provided rich datasets for the development of AI models and have made it possible for AI to flourish in the field of ophthalmology. The study of diseases has expanded from the initial focus on diabetic retinopathy (<xref ref-type="bibr" rid="B5">5</xref>&#x02013;<xref ref-type="bibr" rid="B8">8</xref>), age-related macular degeneration (<xref ref-type="bibr" rid="B9">9</xref>&#x02013;<xref ref-type="bibr" rid="B11">11</xref>), and glaucoma (<xref ref-type="bibr" rid="B12">12</xref>&#x02013;<xref ref-type="bibr" rid="B15">15</xref>) to anterior segment diseases and conditions such as refractive error (<xref ref-type="bibr" rid="B16">16</xref>&#x02013;<xref ref-type="bibr" rid="B18">18</xref>).</p>
<p>Refractive error, represented by myopia, is becoming a key public health issue. Any degree of myopia increases the risk of adverse changes in ocular tissue, and high myopia and pathological myopia (PM) significantly increase the risk of irreversible visual impairment [e.g., glaucoma, retinal detachment, myopic macular degeneration (MMD), and macular choroidal neovascularization] or blindness (<xref ref-type="bibr" rid="B19">19</xref>). Early identification of groups at high risk of myopia, together with regular follow-up to document the progression of myopia and its complications, is essential for eye care providers to plan interventions. However, current healthcare systems may not be able to cope with the growing burden. In particular, the COVID-19 pandemic has demonstrated the need for remote testing and monitoring. Fortunately, AI technology combined with telemedicine can bridge this gap. To date, studies have integrated AI into all stages of the clinical management of myopia and have achieved positive results. This paper introduces the concepts of AI, summarizes the clinical application status, discusses potential challenges and future directions of AI in myopia, and proposes that the establishment of an AI-integrated telemedicine platform will be a new direction in myopia healthcare, providing whole-process, personalized management for myopia patients in the post-COVID-19 period.</p>
</sec>
<sec id="s2">
<title>AI, Machine Learning, and Deep Learning</title>
<p>The concept of AI was first proposed by John McCarthy in 1956. It is defined as the simulation of human intelligence by machines (<xref ref-type="bibr" rid="B20">20</xref>). Machine learning (ML) is a branch of AI and mainly uses computer system programming to perform tasks or predict results (<xref ref-type="bibr" rid="B21">21</xref>). ML has great potential in clinical practice and machine translation (<xref ref-type="bibr" rid="B22">22</xref>). Traditional ML algorithms use variables selected by experts as input and usually do not involve large neural networks. They include algorithms such as linear regression, logistic regression, support vector machine, decision tree, and random forest algorithms (<xref ref-type="bibr" rid="B23">23</xref>). Deep learning (DL) is a subset of ML. Without special programming, it can automatically extract rules from known data to make judgments about unknown data; hence, DL can process more complex data (<xref ref-type="bibr" rid="B24">24</xref>). DL algorithms usually involve large-scale neural networks, such as artificial neural networks (ANNs), convolutional neural networks (CNNs) and recurrent neural networks (RNNs) (<xref ref-type="bibr" rid="B23">23</xref>). Since 2012, the introduction of CNNs has enabled major breakthroughs in DL for imaging-based applications (e.g., object recognition, image segmentation, and disease classification) (<xref ref-type="bibr" rid="B24">24</xref>). VGG, ResNet, Inception and Inception-ResNet are some of the popular CNNs used for classification and are now widely used in medical image recognition (<xref ref-type="bibr" rid="B23">23</xref>). Deep CNNs can learn feature representations from data without handcrafted rules and can process large, high-dimensional training datasets.
Studies have shown that the accuracy of medical image analysis systems based on DL in disease detection is equal to or even better than that of clinicians or trained personnel (<xref ref-type="bibr" rid="B25">25</xref>, <xref ref-type="bibr" rid="B26">26</xref>). Moreover, other studies have proven the potential and feasibility of applying DL algorithms to disease screening and detection (<xref ref-type="bibr" rid="B27">27</xref>, <xref ref-type="bibr" rid="B28">28</xref>). The diagnosis of many ophthalmic diseases requires not only symptom evaluation but also imaging information. This feature leads to the widespread use of AI technology represented by DL in clinical ophthalmology (<xref ref-type="bibr" rid="B1">1</xref>).</p>
<p>The main metrics used to evaluate the quality of an AI model are accuracy, sensitivity and specificity, which are calculated from four quantities: true positives, false positives, true negatives and false negatives (<xref ref-type="table" rid="T1">Table 1</xref>). A receiver operating characteristic (ROC) curve can be drawn with the false positive rate (FPR) on the X-axis and the true positive rate (TPR) on the Y-axis. The area under the curve (AUC) is defined as the area under the ROC curve and generally ranges from 0.5 (for a model with no predictive value) to 1 (for a perfect model) (<xref ref-type="bibr" rid="B29">29</xref>) (<xref ref-type="fig" rid="F1">Figure 1</xref>).</p>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Common terminologies used to evaluate AI model performance.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th/>
<th/>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Predicted outcome</bold></th>
</tr>
<tr>
<th/>
<th/>
<th valign="top" align="left"><bold>Disease</bold></th>
<th valign="top" align="left"><bold>No disease</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><bold>Actual outcome</bold></td>
<td valign="top" align="left"><bold>Disease</bold></td>
<td valign="top" align="left">True positive (TP)</td>
<td valign="top" align="left">False negative (FN)</td>
</tr>
<tr>
<td/>
<td valign="top" align="left"><bold>No disease</bold></td>
<td valign="top" align="left">False positive (FP)</td>
<td valign="top" align="left">True negative (TN)</td>
</tr>
<tr>
<td valign="top" align="left"><bold>Remark</bold></td>
<td valign="top" align="left" colspan="3">Accuracy = (TP&#x0002B;TN)/(TP&#x0002B;FN&#x0002B;FP&#x0002B;TN)</td>
</tr>
<tr>
<td/>
<td valign="top" align="left" colspan="3">Sensitivity = TP/(TP&#x0002B;FN)</td>
</tr>
<tr>
<td/>
<td valign="top" align="left" colspan="3">Specificity = TN/(TN&#x0002B;FP)</td>
</tr>
<tr>
<td/>
<td valign="top" align="left" colspan="3">True positive rate (TPR) = Sensitivity</td>
</tr>
<tr>
<td/>
<td valign="top" align="left" colspan="3">False positive rate (FPR) = 1-Specificity</td>
</tr>
</tbody>
</table>
</table-wrap>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Three examples of ROC curves are illustrated. <bold>(A)</bold> AUC=1: A &#x0201C;perfect&#x0201D; classifier; <bold>(B)</bold> 0.5&#x0003C;AUC &#x0003C;1: A real-world classifier, better than random guessing; <bold>(C)</bold> AUC=0.5: Equivalent to random guessing (e.g., coin tossing); the model has no predictive value.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fmed-09-840498-g0001.tif"/>
</fig>
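<p>As a minimal illustration of the formulas in Table 1, the following Python sketch (using hypothetical confusion-matrix counts, not data from any study cited here) computes accuracy, sensitivity, specificity and the FPR, and approximates the AUC from a few ROC points with the trapezoidal rule.</p>

```python
def confusion_metrics(tp, fp, tn, fn):
    """Standard metrics from the four confusion-matrix counts (Table 1)."""
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    sensitivity = tp / (tp + fn)      # true positive rate (TPR)
    specificity = tn / (tn + fp)
    fpr = 1 - specificity             # false positive rate
    return accuracy, sensitivity, specificity, fpr

def auc_trapezoid(roc_points):
    """Approximate AUC from (FPR, TPR) points sorted by FPR,
    including the endpoints (0, 0) and (1, 1)."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(roc_points, roc_points[1:]):
        area += (x2 - x1) * (y1 + y2) / 2
    return area

# Hypothetical example: 40 TP, 10 FP, 45 TN, 5 FN
acc, sens, spec, fpr = confusion_metrics(40, 10, 45, 5)  # acc = 0.85
auc = auc_trapezoid([(0.0, 0.0), (fpr, sens), (1.0, 1.0)])
```

<p>A single classification threshold yields one (FPR, TPR) point; sweeping the threshold traces the full ROC curve, whose area ranges from 0.5 (no predictive value) to 1 (a perfect classifier).</p>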
</sec>
<sec id="s3">
<title>Global Burden of Myopia</title>
<p>Myopia is one of the most common ophthalmic diseases in the world. It mainly occurs in childhood and early adulthood (<xref ref-type="bibr" rid="B30">30</xref>). According to the work of Holden and his coworkers, the global prevalence of myopia is close to 28.3% (2 billion) of the world&#x00027;s population, of which 4.0% (277 million) suffer from high myopia. The &#x0201C;myopia epidemic&#x0201D; is estimated to affect 49.8% (4.758 billion) of the world&#x00027;s population by 2050, with 9.8% (938 million) suffering from high myopia (&#x02264; -5.00 D). Of note, Holden et al. standardized high myopia to a spherical equivalent of &#x02212;5.00 D or less because this threshold is widely used to identify people at higher risk of pathologic myopia (<xref ref-type="bibr" rid="B31">31</xref>). Nature (genetics and heredity) and nurture (environment and lifestyle) are both factors leading to myopia (<xref ref-type="bibr" rid="B19">19</xref>). For most people with myopia, the most critical risk factor is likely related to modern lifestyles, which involve long periods of near-work activity. The outbreak of COVID-19 at the end of 2019 undoubtedly exacerbated this trend. Research shows that during the COVID-19 pandemic, reduced time spent outdoors and increased exposure to electronic screens have further increased the risk of myopia in children (<xref ref-type="bibr" rid="B32">32</xref>, <xref ref-type="bibr" rid="B33">33</xref>).</p>
<p>Most cases of myopia are associated with excessive axial growth (<xref ref-type="bibr" rid="B19">19</xref>). Retinal damage caused by excessive axial growth is irreversible. Irreversible visual impairments caused by myopia (e.g., glaucoma, retinal detachment, MMD and macular choroidal neovascularization) or blindness not only increase medical costs but also reduce the quality of life of patients, which has caused a global medical and economic burden. Therefore, it is of great significance to comprehensively carry out myopia healthcare services, including the detection, diagnosis, progression prediction and treatment of myopia, as well as the management and prevention of ocular complications and visual impairment in patients with high myopia.</p>
</sec>
<sec id="s4">
<title>AI in the Detection and Diagnosis of Myopia</title>
<sec>
<title>Refractive Error Assessment</title>
<p>To evaluate refractive error, traditional visual acuity examinations are not only time consuming and laborious but also rely on expensive machines and experienced doctors and technicians. People who have difficulty expressing themselves (e.g., young children, the elderly, and patients with verbal communication disabilities) find it particularly hard to cooperate during an examination (<xref ref-type="bibr" rid="B34">34</xref>). In developing countries or impoverished areas, the lack of doctors and medical equipment makes it difficult to accurately evaluate refractive error, and patients are likely to miss the optimal treatment window, resulting in an irreversible loss of vision. Thus, timely, high-quality refraction services that are accepted by the general population are urgently needed.</p>
<p>While it is generally difficult for ophthalmologists to evaluate refractive error from a retinal fundus photograph, DL techniques are capable of predicting it fairly accurately. Varadarajan et al. (<xref ref-type="bibr" rid="B16">16</xref>) trained a DL algorithm to predict refractive error from retinal fundus photographs. By analyzing attention maps to determine the parts of a photograph most relevant for prediction, they found that the attention maps consistently highlighted the fovea as an important feature for prediction. Tan et al. (<xref ref-type="bibr" rid="B35">35</xref>) also reported that, using color fundus photographs, a system combining a pretrained CNN with the XGBoost algorithm was able to evaluate refractive error with a high degree of accuracy. Yang et al. (<xref ref-type="bibr" rid="B17">17</xref>) trained a DL system to detect myopia automatically from ocular appearance images, and the system obtained an AUC of 0.9270. This research demonstrated the possibility of screening and monitoring refractive status in children with myopia in remote areas.</p>
</sec>
<sec>
<title>The Diagnosis of Pathologic Myopia and Complications</title>
<p>PM is accompanied by degenerative changes in the retina, which, if left untreated, can lead to irrecoverable vision loss. It is essential for ophthalmologists to have a sustainable method of monitoring eyes with PM to reduce blinding complications, especially given that many PM patients are young or middle-aged. However, the diagnosis of PM, defined by peripapillary atrophy and myopic maculopathy, generally requires a complete examination that includes visual acuity assessment and color fundus photograph acquisition, tasks that are labor intensive and skill dependent (<xref ref-type="bibr" rid="B36">36</xref>).</p>
<p>Tan et al. (<xref ref-type="bibr" rid="B37">37</xref>) introduced a method to automatically detect PM <italic>via</italic> peripapillary atrophy features by means of variational level sets from fundus photographs. To improve prediction accuracy, Zhang et al. (<xref ref-type="bibr" rid="B38">38</xref>) proposed a computer-aided framework based on an ML algorithm for the detection of PM. By analyzing demographic and clinical information, retinal fundus photograph data and genotyping data from 2,258 subjects, this method achieved an AUC of 0.888 and outperformed the detection results obtained from the use of demographic and clinical information (an AUC of 0.607), imaging data (an AUC of 0.852) or genotyping data (an AUC of 0.774) alone, with increases of 46.3% (<italic>p</italic> &#x0003C; 0.005), 4.2% (<italic>p</italic> = 0.19), and 14.7% (<italic>p</italic> &#x0003C; 0.005), respectively. Recently, Hemelings et al. (<xref ref-type="bibr" rid="B39">39</xref>) developed a successful approach based on a DL algorithm for the simultaneous detection of PM, with an AUC of 0.9867, and the segmentation of myopia-induced lesions. Other similar studies have also been reported, such as those identifying the different types of lesions of myopic maculopathy automatically from fundus photographs with DL models (<xref ref-type="bibr" rid="B40">40</xref>, <xref ref-type="bibr" rid="B41">41</xref>). In addition, OCT macular images were used for the development of CNN models to identify vision-threatening conditions, such as retinoschisis, macular holes and retinal detachment, in adults with high myopia, and the models obtained good sensitivity and AUC scores (<xref ref-type="bibr" rid="B42">42</xref>, <xref ref-type="bibr" rid="B43">43</xref>).</p>
</sec>
</sec>
<sec id="s5">
<title>AI in the Prediction of Myopia Progression</title>
<p>Considering the potential irreversible disease burden during adulthood, concerns from parents, clinicians and policy makers include the potential progression rate and risk of developing high or even pathological myopia from childhood myopia (<xref ref-type="bibr" rid="B44">44</xref>). Thus, predicting myopia progression can provide evidence for transforming clinical practice, health policy-making, and precise individualized interventions regarding the practical control of school-aged myopia.</p>
<p>Lin et al. (<xref ref-type="bibr" rid="B45">45</xref>) identified myopia development rules and predicted the onset of myopia and its progression for children and teenagers from clinical measures using a random forest ML model, which had good predictive performance (the AUC ranged from 0.801 to 0.837) for up to 8 years in the future. Yang et al. (<xref ref-type="bibr" rid="B46">46</xref>) developed a prediction model to predict myopia in adolescents based on both measurement and behavior data of primary school students, and the model achieved reasonable performance and accuracy. Further research is still required for interpopulation validation to allow these models to be generalized.</p>
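<p>Neither the model of Lin et al. nor its dataset is public; the sketch below only illustrates the general approach with a random forest on synthetic data, using hypothetical clinical features (age, baseline spherical equivalent, annual progression rate). It is not a reimplementation of any published model.</p>

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical clinical features
age = rng.uniform(6, 15, n)        # age in years
se = rng.uniform(-6.0, 1.0, n)     # baseline spherical equivalent (D)
rate = rng.uniform(-1.2, 0.0, n)   # annual progression rate (D/year)
X = np.column_stack([age, se, rate])

# Synthetic target: spherical equivalent three years later
y = se + 3 * rate + rng.normal(0, 0.25, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
```

<p>In practice, such a model would be trained on longitudinal refraction records and would still require interpopulation validation, as noted above, before it could be generalized.</p>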
</sec>
<sec id="s6">
<title>AI in Refractive Surgery for Myopia</title>
<p>The aim of refractive surgery is to correct refractive error in adults with stable myopia and reduce their dependence on corrective aids. Keratorefractive procedures and intraocular procedures are the two main forms of refractive surgery. At present, keratorefractive procedures include laser epithelial keratomileusis (LASEK), laser <italic>in situ</italic> keratomileusis (LASIK) and small incision lenticule extraction (SMILE). Intraocular procedures include phakic intraocular lens (PIOL) implantation and cataract surgery (<xref ref-type="bibr" rid="B19">19</xref>). To achieve optimal visual and refractive outcomes and to minimize the risk of postoperative complications, researchers have creatively applied AI to various stages of refractive surgery and achieved promising results, particularly in preoperative screening for the risk of ectasia following LASIK, guiding the formulation of surgical plans and intraocular lens (IOL) power calculations.</p>
<sec>
<title>Preoperative Screening</title>
<p>In 1998, Seiler et al. (<xref ref-type="bibr" rid="B47">47</xref>) published the first reports of iatrogenic progressive ectasia after LASIK, also known as iatrogenic ectasia. This complication can cause postoperative refractive regression and seriously compromise surgical outcomes. Ectasia occurs due to biomechanical decompensation of the stroma, which may be related to pre-existing biomechanical weakening (e.g., keratoconus, subclinical keratoconus, and forme fruste keratoconus) or a severe impact on the corneal structure (e.g., an attempted treatment for high myopia) (<xref ref-type="bibr" rid="B48">48</xref>). Screening before refractive surgery is extremely important to identify candidates at high risk of iatrogenic ectasia. Xie et al. (<xref ref-type="bibr" rid="B49">49</xref>) combined a DL algorithm with corneal tomographic scans to develop the Pentacam InceptionResNetV2 Screening System to screen potential candidates for refractive surgery. They reported a sensitivity of 80% for identifying ectasia suspects, 90% for diagnosing early keratoconus, and an overall diagnostic accuracy of 95% with an AUC of 0.99. Developing more accurate AI-based algorithms for identifying candidates at high risk of iatrogenic ectasia will require longitudinal follow-up and larger clinical datasets to train and validate the models.</p>
</sec>
<sec>
<title>Guiding the Formulation of a Surgical Plan</title>
<p>AI technology can guide a surgeon in selecting the best corneal refractive surgery method for a specific patient. Yoo et al. (<xref ref-type="bibr" rid="B50">50</xref>) developed an expert-level multiclass ML model for selecting refractive surgery options for patients. They classified patients into LASEK, LASIK, SMILE and contraindication groups. Using data from 18,480 subjects who intended to undergo refractive surgery, the model was trained to select the optimal refractive surgery type for patients, with accuracies of 81 and 78.9% on the internal and external validation datasets, respectively. Cui et al. (<xref ref-type="bibr" rid="B51">51</xref>) developed an ML model to recommend a nomogram for SMILE surgery to achieve the optimal postoperative visual outcome. They reported that the efficacy index in the ML group (1.48 &#x000B1; 1.08) was significantly higher than that in the surgeon group (1.3 &#x000B1; 0.27) (<italic>t</italic> = &#x02212;2.17, <italic>p</italic> &#x0003C; 0.05). For high myopia patients who intend to undergo PIOL surgery, which involves the insertion of an additional lens in the anterior segment, an accurate anterior chamber depth (ACD) measurement is essential (<xref ref-type="bibr" rid="B52">52</xref>). ACD measurement is usually obtained with conventional A-scan ultrasound. However, these machines are expensive and cumbersome and may not be available in remote areas. Chen et al. (<xref ref-type="bibr" rid="B53">53</xref>) developed a new method for predicting central ACD using a portable smartphone slit lamp device aided by ML. This novel device may offer a new, more convenient way to measure ACD.</p>
</sec>
<sec>
<title>IOL Power Calculation Related to Myopia</title>
<p>For patients who intend to undergo PIOL implantation or cataract surgery to correct refractive error, accurate IOL power is the key to improving their postoperative visual quality. Ongoing developments in IOL power calculation incorporate new technology and data science to improve the accuracy of IOL selection (<xref ref-type="bibr" rid="B54">54</xref>). Compared with the second- and third-generation formulas, fourth-generation formulas, such as the Olsen formula (based on ray tracing) and Barrett Universal II (BUII), show good accuracy and fewer refractive surprises (<xref ref-type="bibr" rid="B55">55</xref>). A recent study developed a new XGBoost ML-based calculator for highly myopic eyes, which incorporated the BUII formula results and showed a significant improvement in the percentage of eyes with a prediction error within &#x000B1;0.25 D compared with the BUII formula alone (<xref ref-type="bibr" rid="B18">18</xref>). To date, for high axial myopia, AI-based IOL formulas seem to demonstrate higher levels of accuracy, including the Hill-radial basis function (RBF) calculator and the Kane formula (<xref ref-type="bibr" rid="B56">56</xref>&#x02013;<xref ref-type="bibr" rid="B59">59</xref>). The Hill-RBF calculator uses AI and regression analysis with a very large database of actual postsurgical refractive outcomes to predict IOL power (<xref ref-type="bibr" rid="B59">59</xref>). Hill-RBF is based mainly on empirical data; thus, its accuracy is limited by the type of data and eye characteristics from which it is derived (<xref ref-type="bibr" rid="B54">54</xref>). To overcome this limitation, Hill-RBF 2.0 expanded the database and improved IOL power prediction for a wider range of eye characteristics, such as high axial myopia, by continuously collecting various eye characteristics and surgical results (<xref ref-type="bibr" rid="B57">57</xref>). In September 2020, Hill-RBF 3.0 was released.
With the continued expansion of the Hill-RBF database, the calculator is likely to achieve better accuracy in IOL power prediction. The other promising method for IOL calculation is the Kane formula, which incorporates AI with theoretical optics to predict IOL power (<xref ref-type="bibr" rid="B54">54</xref>). Studies have shown that the Kane formula has a smaller absolute error than the BUII, Olsen, and Hill-RBF 2.0 formulas (<xref ref-type="bibr" rid="B60">60</xref>, <xref ref-type="bibr" rid="B61">61</xref>). In a study of 10,930 eyes in Britain, the Kane formula had the lowest mean absolute prediction error across all ranges of axial length (AL) and the smallest absolute error for long eyes (AL&#x0003E;26.0 mm) (<xref ref-type="bibr" rid="B60">60</xref>).</p>
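<p>The outcome measures used to compare these formulas, such as the mean absolute prediction error and the percentage of eyes within &#x000B1;0.25 D of the predicted refraction, can be computed directly from predicted and achieved postoperative refractions. The sketch below uses made-up values purely for illustration; it does not reproduce any formula or dataset cited here.</p>

```python
def iol_error_stats(predicted, achieved):
    """Mean absolute prediction error (D) and the fraction of eyes whose
    achieved postoperative refraction is within +/-0.25 D of prediction."""
    errors = [a - p for p, a in zip(predicted, achieved)]
    mae = sum(abs(e) for e in errors) / len(errors)
    within_quarter_d = sum(abs(e) <= 0.25 for e in errors) / len(errors)
    return mae, within_quarter_d

# Hypothetical predicted vs. achieved refractions (D) for four eyes
mae, frac = iol_error_stats([-0.10, 0.00, -0.30, 0.20],
                            [-0.20, 0.30, -0.25, 0.20])
```

<p>Reported comparisons between formulas typically rank them by these two statistics on large postsurgical datasets.</p>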
</sec>
</sec>
<sec id="s7">
<title>AI and Monitoring Devices in the Behavioral Intervention of Myopia</title>
<p>Effective behavioral intervention is as important as early detection in preventing myopia or limiting its progression. To understand behaviors related to myopia onset and progression, a wearable device named Vivior Monitor (Vivior AG, Zurich, Switzerland) was developed to investigate the visual behavior of children with myopia (6&#x02013;16 years old) (<xref ref-type="bibr" rid="B62">62</xref>). Using ML algorithms, Vivior Monitor identified types of visual activities, such as viewing handheld media, desktop work, and computer work. This research reported that older children spent less time viewing objects at distance, more time using a computer and less time engaging in physical activity. There is no doubt that outdoor activity is the main protective factor against myopia (<xref ref-type="bibr" rid="B63">63</xref>, <xref ref-type="bibr" rid="B64">64</xref>). Wearable devices combined with internet or social network apps aimed at encouraging children to spend more time outdoors are now being developed. The Singapore Eye Research Institute developed a novel wearable fitness tracker (FitSight), which comprises a smartwatch (Sony Smartwatch 3; Sony Corp., Minato, Tokyo, Japan) with a light sensor and an accompanying smartphone app that logs time spent outdoors and sends feedback to parents and children (<xref ref-type="bibr" rid="B65">65</xref>). In addition, excessive near work is one of the most common unhealthy visual behaviors related to myopia, and many studies have shown that it can accelerate the onset and progression of myopia (<xref ref-type="bibr" rid="B66">66</xref>, <xref ref-type="bibr" rid="B67">67</xref>). Clouclip (Glasson Technology Co. Ltd., Hangzhou, China), a cloud-based sensor device that attaches to the side of a pair of spectacles, can objectively and dynamically monitor the wearer&#x00027;s near-work distance and duration (<xref ref-type="bibr" rid="B68">68</xref>, <xref ref-type="bibr" rid="B69">69</xref>). 
This device can provide a vibration alert when it detects risky near-work behaviors, such as particularly short viewing distances or prolonged continuous near work. Cao et al. (<xref ref-type="bibr" rid="B68">68</xref>) collected data from 67 subjects who were assigned to wear Clouclip all day (except while bathing and sleeping) during the experiment; they found that the device significantly modified near-work behaviors in school-age children and that its effects persisted for some time.</p>
</sec>
<sec id="s8">
<title>AI in Myopia Genetics</title>
<p>The mechanism of myopia is extremely complex. Both nature (genetics and heredity) and nurture (environment and lifestyle) contribute to myopia (<xref ref-type="bibr" rid="B19">19</xref>). In recent years, studies on the genetics of myopia have received considerable attention. Through linkage analysis, candidate gene analysis, genome-wide association studies (GWAS) and next-generation sequencing (NGS), more than 100 genes and over 20 chromosomal loci have been identified as associated with myopia or related quantitative traits (<xref ref-type="bibr" rid="B70">70</xref>&#x02013;<xref ref-type="bibr" rid="B72">72</xref>). However, current knowledge about the genetic contributions of these loci and genes to myopia remains limited (<xref ref-type="bibr" rid="B73">73</xref>).</p>
<p>To date, studies using big data for genetic analysis and phenotyping correlation have achieved significant progress in various medical fields (<xref ref-type="bibr" rid="B74">74</xref>, <xref ref-type="bibr" rid="B75">75</xref>). Genomic readouts, combined with advanced AI, could be a powerful approach for risk prediction in multifactorial diseases such as myopia. At present, both CNNs and RNNs have shown considerable potential in a variety of clinical genomics applications, such as variant calling, genome annotation, and functional impact prediction (<xref ref-type="bibr" rid="B76">76</xref>). Given the diversity of myopia with regard to its environmental burden, geographic patterns, and affiliations with different ethnicities and cultural groups worldwide (<xref ref-type="bibr" rid="B73">73</xref>), further AI research with larger multiethnic genetic samples from various research institutes will be essential to drive the discovery of new insights into the genetic aspects of myopia and advance AI-genomic applications in managing childhood myopia (<xref ref-type="bibr" rid="B77">77</xref>).</p>
</sec>
<sec id="s9">
<title>New Model For Myopia Management: Telemedicine</title>
<p>Telemedicine is a new service model in the medical field that aims to improve healthcare access for people in remote and underdeveloped areas by providing medical services at a distance (<xref ref-type="bibr" rid="B78">78</xref>). The global COVID-19 pandemic has brought telemedicine to the forefront of ophthalmic medical services (<xref ref-type="bibr" rid="B79">79</xref>, <xref ref-type="bibr" rid="B80">80</xref>). With the development of AI technology and the expansion of 5G network coverage, AI-integrated telemedicine platforms will gradually become the new normal of post-COVID-19 ophthalmic care. In the clinical management of myopia, AI-integrated telemedicine platforms should mainly focus on the following aspects: reducing the manpower requirements of ophthalmic clinics, reducing the risk of direct physical contact between patients and doctors, and personalizing management throughout the whole process.</p>
<p>AI-based devices enable non-ophthalmologist healthcare workers, such as optometrists, nurses and technicians, to perform several tasks individually, such as assessing refractive error and measuring ACDs, rather than requiring patients to move through a number of different clinical staff, each performing a specific task. In addition, telemedicine can not only reduce direct physical contact between patients and doctors but also extend the working distance of ophthalmic examinations. For example, the portable slit lamp has increased the examination distance from 18 cm to 55 cm, and the Glasgow Retinal Imaging Adaptor (Medical Devices Unit, NHS Greater Glasgow &#x00026; Clyde, UK) has increased it from the 5 cm of the direct ophthalmoscope to 47 cm (<xref ref-type="bibr" rid="B81">81</xref>). These changes can not only satisfy the need for regular, repeated follow-up to monitor and document refractive status efficiently but also limit exposure risks.</p>
<p>To provide personalized management for myopia patients throughout the whole process, we first need to integrate hospital, community and family health management. Recently, Wu et al. (<xref ref-type="bibr" rid="B82">82</xref>) proposed an AI-integrated telemedicine platform to screen and refer patients with cataracts. According to the authors, this telemedicine platform involves self-monitoring at home, primary healthcare and specialized hospital services. Inspired by this platform, we propose a new management model for myopia (<xref ref-type="fig" rid="F2">Figure 2</xref>). First, considering that myopia develops primarily during childhood and early adulthood, large-scale refractive error screening of the target population will be carried out regularly with portable AI-based devices and technologies, and the examination data will be stored and documented on telemedicine platforms. Second, AI analysis will be conducted on the collected clinical data, images and potential genomic data to classify the risk of myopia progression in clinically identified individuals and to formulate personalized management plans, including visual behavioral interventions with wearable devices (<xref ref-type="bibr" rid="B77">77</xref>). Third, home monitoring (using visual acuity tests and ocular appearance images taken by family members with cell phones) can be implemented for patients without myopia-related complications. Home monitoring combined with community-based primary healthcare institutions (where retinal fundus photographs or OCT scans are captured and analyzed by AI on the telemedicine platform) can be used for patients with non-blinding myopia complications. If these patients develop pathological myopia or blinding complications, they can be referred to a specialized hospital <italic>via</italic> a fast-track notification system. 
Patients initially diagnosed with pathological myopia or blinding complications should be directly transferred to tertiary medical institutions. After treatment, patients can return home and resume home monitoring. Fourth, for patients requiring surgical treatment, AI-integrated telemedicine can be applied to preoperative screening to determine the risk of ectasia following LASIK and to guide the formulation of a surgical plan and IOL power calculation.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>The workflow of the AI-integrated telemedicine platform for myopia. <bold>(A)</bold> The workflow of initial grouping, including myopia screening, file establishment, AI analysis and progression risk stratification for myopia patients. <bold>(B)</bold> The workflow of continuous management, involving self-monitoring at home, primary healthcare and specialized hospital services.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fmed-09-840498-g0002.tif"/>
</fig>
</sec>
<sec id="s10">
<title>Current Challenges and Future Directions</title>
<p>Despite the reported successful clinical applications of AI in myopia, challenges and hurdles are still present. Critical technical and clinical limitations must be surmounted prior to the widespread implementation of AI in myopia.</p>
<sec>
<title>Standardization of Datasets</title>
<p>Image-based AI technology has made progress in refractive error assessment and in the screening, diagnosis and treatment of myopia. However, image-based AI requires large, standardized, labeled datasets, and ophthalmic open datasets remain very small compared with ImageNet&#x00027;s tens of millions of images (<xref ref-type="bibr" rid="B13">13</xref>). Obtaining large-scale, high-quality images in a real clinical environment is a great challenge. Technically, more advanced data augmentation methods should be utilized, such as programmatically integrating simulated lesions into normal image data (<xref ref-type="bibr" rid="B83">83</xref>) or incorporating real lesions into other locations in normal or abnormal images (<xref ref-type="bibr" rid="B84">84</xref>). Recent studies have proposed alternative training methods that can learn from less data. For example, some studies synthesize large numbers of random, diverse medical images with generative adversarial networks and report that these images could serve as CNN training datasets in the future (<xref ref-type="bibr" rid="B85">85</xref>&#x02013;<xref ref-type="bibr" rid="B87">87</xref>). However, these new methods have not yet achieved significant success, and their effectiveness needs to be further proven (<xref ref-type="bibr" rid="B88">88</xref>). In addition to the amount of data, image quality also strongly affects the performance of AI models (<xref ref-type="bibr" rid="B5">5</xref>, <xref ref-type="bibr" rid="B89">89</xref>). One study reported that leaving poor-quality fundus images in the dataset decreased the AUC by 0.1 (<xref ref-type="bibr" rid="B90">90</xref>). To surmount this challenge, Wu et al. (<xref ref-type="bibr" rid="B23">23</xref>) proposed an image quality assessment system to select high-quality images; the feasibility of this method needs further study.</p>
</sec>
<sec>
<title>Attitude Toward AI</title>
<p>Because DL is an end-to-end learning method, that is, it takes raw data as input and outputs results directly without manual feature engineering, it cannot explain its detection results or provide an explicit basis for its judgments; this is called the &#x0201C;black box phenomenon.&#x0201D; This could reduce the acceptance of test results by ophthalmologists and patients (<xref ref-type="bibr" rid="B91">91</xref>). With the development of DL, several approaches are now available to help improve the interpretability of results, including occlusion tests (<xref ref-type="bibr" rid="B92">92</xref>) and saliency maps (<xref ref-type="bibr" rid="B93">93</xref>). However, there is no consensus on which saliency map generation method is most appropriate for ophthalmic imaging data (<xref ref-type="bibr" rid="B93">93</xref>). In addition, it is unclear how one should interpret non-traditional features identified by saliency analysis, that is, whether they should be treated as novel biomarkers or as spurious correlations &#x0201C;learned&#x0201D; during training. Processes need to be in place to address such disagreements, such as review by an independent third party from a multidisciplinary team, as would occur in cases of clinical uncertainty (<xref ref-type="bibr" rid="B36">36</xref>). Moreover, education on the implementation and appraisal of AI systems should be included in medical school programs and hospital training to prepare for adoption once the technology matures for ophthalmic clinical practice.</p>
</sec>
<sec>
<title>Ethical, Legal, and Regulatory Issues</title>
<p>With the increasing use of AI, security and privacy have become concerns involving ethical, legal and regulatory issues (<xref ref-type="bibr" rid="B94">94</xref>). For example, an AI algorithm, like a human ophthalmologist, is inevitably prone to errors. Who bears the legal consequences of an undesirable outcome caused by an erroneous judgment made by an AI algorithm? Is it the company that developed the algorithm, the individual physician who used it, or the healthcare organization that employs the physician (<xref ref-type="bibr" rid="B95">95</xref>)? In addition, protocols and laws aimed at guaranteeing the security of training and testing data in AI need to be continually established and improved.</p>
</sec>
</sec>
<sec sec-type="conclusions" id="s11">
<title>Conclusions</title>
<p>Given the rapid increase in the prevalence of myopia at all levels over the past three decades and the rapid, non-linear spread of COVID-19, there is a need to revolutionize healthcare systems worldwide. Three main areas are the targets of such revolutions: improving efficiency, limiting exposure risk, and providing individualized management for myopic patients. AI is among the most promising solutions to address these issues. Before AI is widely adopted in myopia care, AI models need to be further optimized to improve their interpretability, human&#x02013;machine interactions, generalization ability, and robustness. It is also necessary to develop relevant clinical standards, integrate large-scale clinical datasets, and establish a standard framework for evaluating AI models in clinical practice. Moreover, relevant laws and regulations need to be continually improved to achieve comprehensive supervision of practical applications.</p>
</sec>
<sec id="s12">
<title>Author Contributions</title>
<p>CZ, JZ, and ZZ conceived the idea for the article and contributed to the initial drafting of the manuscript. YL, KL, and YW performed literature search and collected data. YZ was involved in reviewing and editing the manuscript. All authors read and approved the final manuscript.</p>
</sec>
<sec sec-type="funding-information" id="s13">
<title>Funding</title>
<p>This work was supported by the Science and Technology Development Plan Project of Jilin Province (No. 20190303186SF).</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s14">
<title>Publisher&#x00027;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<ref-list>
<title>References</title>
<ref id="B1">
<label>1.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ting</surname> <given-names>DSW</given-names></name> <name><surname>Peng</surname> <given-names>L</given-names></name> <name><surname>Varadarajan</surname> <given-names>AV</given-names></name> <name><surname>Keane</surname> <given-names>PA</given-names></name> <name><surname>Burlina</surname> <given-names>PM</given-names></name> <name><surname>Chiang</surname> <given-names>MF</given-names></name> <etal/></person-group>. <article-title>Deep learning in ophthalmology: the technical and clinical considerations</article-title>. <source>Prog Retin Eye Res.</source> (<year>2019</year>) <volume>72</volume>:<fpage>100759</fpage>. <pub-id pub-id-type="doi">10.1016/j.preteyeres.2019.04.003</pub-id><pub-id pub-id-type="pmid">31048019</pub-id></citation></ref>
<ref id="B2">
<label>2.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Saba</surname> <given-names>L</given-names></name> <name><surname>Biswas</surname> <given-names>M</given-names></name> <name><surname>Kuppili</surname> <given-names>V</given-names></name> <name><surname>Cuadrado Godia</surname> <given-names>E</given-names></name> <name><surname>Suri</surname> <given-names>HS</given-names></name> <name><surname>Edla</surname> <given-names>DR</given-names></name> <etal/></person-group>. <article-title>The present and future of deep learning in radiology</article-title>. <source>Eur J Radiol.</source> (<year>2019</year>) <volume>114</volume>:<fpage>14</fpage>&#x02013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1016/j.ejrad.2019.02.038</pub-id><pub-id pub-id-type="pmid">31005165</pub-id></citation></ref>
<ref id="B3">
<label>3.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dey</surname> <given-names>D</given-names></name> <name><surname>Slomka</surname> <given-names>PJ</given-names></name> <name><surname>Leeson</surname> <given-names>P</given-names></name> <name><surname>Comaniciu</surname> <given-names>D</given-names></name> <name><surname>Shrestha</surname> <given-names>S</given-names></name> <name><surname>Sengupta</surname> <given-names>PP</given-names></name> <etal/></person-group>. <article-title>Artificial intelligence in cardiovascular imaging: JACC state-of-the-art review</article-title>. <source>J Am Coll Cardiol.</source> (<year>2019</year>) <volume>73</volume>:<fpage>1317</fpage>&#x02013;<lpage>35</lpage>. <pub-id pub-id-type="doi">10.1016/j.jacc.2018.12.054</pub-id><pub-id pub-id-type="pmid">30898208</pub-id></citation></ref>
<ref id="B4">
<label>4.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bi</surname> <given-names>WL</given-names></name> <name><surname>Hosny</surname> <given-names>A</given-names></name> <name><surname>Schabath</surname> <given-names>MB</given-names></name> <name><surname>Giger</surname> <given-names>ML</given-names></name> <name><surname>Birkbak</surname> <given-names>NJ</given-names></name> <name><surname>Mehrtash</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Artificial intelligence in cancer imaging: clinical challenges and applications</article-title>. <source>CA Cancer J Clin.</source> (<year>2019</year>) <volume>69</volume>:<fpage>127</fpage>&#x02013;<lpage>57</lpage>. <pub-id pub-id-type="doi">10.3322/caac.21552</pub-id><pub-id pub-id-type="pmid">30720861</pub-id></citation></ref>
<ref id="B5">
<label>5.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ting</surname> <given-names>DSW</given-names></name> <name><surname>Cheung</surname> <given-names>CY</given-names></name> <name><surname>Lim</surname> <given-names>G</given-names></name> <name><surname>Tan</surname> <given-names>GSW</given-names></name> <name><surname>Quang</surname> <given-names>ND</given-names></name> <name><surname>Gan</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Development and validation of a deep learning system for diabetic retinopathy and related eye diseases using retinal images from multiethnic populations with diabetes</article-title>. <source>JAMA.</source> (<year>2017</year>) <volume>318</volume>:<fpage>2211</fpage>&#x02013;<lpage>23</lpage>. <pub-id pub-id-type="doi">10.1001/jama.2017.18152</pub-id><pub-id pub-id-type="pmid">29234807</pub-id></citation></ref>
<ref id="B6">
<label>6.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gulshan</surname> <given-names>V</given-names></name> <name><surname>Peng</surname> <given-names>L</given-names></name> <name><surname>Coram</surname> <given-names>M</given-names></name> <name><surname>Stumpe</surname> <given-names>MC</given-names></name> <name><surname>Wu</surname> <given-names>D</given-names></name> <name><surname>Narayanaswamy</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs</article-title>. <source>JAMA.</source> (<year>2016</year>) <volume>316</volume>:<fpage>2402</fpage>&#x02013;<lpage>10</lpage>. <pub-id pub-id-type="doi">10.1001/jama.2016.17216</pub-id><pub-id pub-id-type="pmid">31170223</pub-id></citation></ref>
<ref id="B7">
<label>7.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gargeya</surname> <given-names>R</given-names></name> <name><surname>Leng</surname> <given-names>T</given-names></name></person-group>. <article-title>Automated identification of diabetic retinopathy using deep learning</article-title>. <source>Ophthalmology.</source> (<year>2017</year>) <volume>124</volume>:<fpage>962</fpage>&#x02013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2017.02.008</pub-id><pub-id pub-id-type="pmid">28359545</pub-id></citation></ref>
<ref id="B8">
<label>8.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abr&#x000E0;moff</surname> <given-names>MD</given-names></name> <name><surname>Lou</surname> <given-names>Y</given-names></name> <name><surname>Erginay</surname> <given-names>A</given-names></name> <name><surname>Clarida</surname> <given-names>W</given-names></name> <name><surname>Amelon</surname> <given-names>R</given-names></name> <name><surname>Folk</surname> <given-names>JC</given-names></name> <etal/></person-group>. <article-title>Improved automated detection of diabetic retinopathy on a publicly available dataset through integration of deep learning</article-title>. <source>Invest Ophthalmol Vis Sci.</source> (<year>2016</year>) <volume>57</volume>:<fpage>5200</fpage>&#x02013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1167/iovs.16-19964</pub-id><pub-id pub-id-type="pmid">27701631</pub-id></citation></ref>
<ref id="B9">
<label>9.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Burlina</surname> <given-names>PM</given-names></name> <name><surname>Joshi</surname> <given-names>N</given-names></name> <name><surname>Pekala</surname> <given-names>M</given-names></name> <name><surname>Pacheco</surname> <given-names>KD</given-names></name> <name><surname>Freund</surname> <given-names>DE</given-names></name> <name><surname>Bressler</surname> <given-names>NM</given-names></name></person-group>. <article-title>Automated grading of age-related macular degeneration from color fundus images using deep convolutional neural networks</article-title>. <source>JAMA Ophthalmol.</source> (<year>2017</year>) <volume>135</volume>:<fpage>1170</fpage>&#x02013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1001/jamaophthalmol.2017.3782</pub-id><pub-id pub-id-type="pmid">28973096</pub-id></citation></ref>
<ref id="B10">
<label>10.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Treder</surname> <given-names>M</given-names></name> <name><surname>Lauermann</surname> <given-names>JL</given-names></name> <name><surname>Eter</surname> <given-names>N</given-names></name></person-group>. <article-title>Automated detection of exudative age-related macular degeneration in spectral domain optical coherence tomography using deep learning</article-title>. <source>Graefes Arch Clin Exp Ophthalmol.</source> (<year>2018</year>) <volume>256</volume>:<fpage>259</fpage>&#x02013;<lpage>65</lpage>. <pub-id pub-id-type="doi">10.1007/s00417-017-3850-3</pub-id><pub-id pub-id-type="pmid">29159541</pub-id></citation></ref>
<ref id="B11">
<label>11.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schlegl</surname> <given-names>T</given-names></name> <name><surname>Waldstein</surname> <given-names>SM</given-names></name> <name><surname>Bogunovic</surname> <given-names>H</given-names></name> <name><surname>Endstra&#x000DF;er</surname> <given-names>F</given-names></name> <name><surname>Sadeghipour</surname> <given-names>A</given-names></name> <name><surname>Philip</surname> <given-names>AM</given-names></name> <etal/></person-group>. <article-title>Fully automated detection and quantification of macular fluid in OCT using deep learning</article-title>. <source>Ophthalmology.</source> (<year>2018</year>) <volume>125</volume>:<fpage>549</fpage>&#x02013;<lpage>58</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2017.10.031</pub-id><pub-id pub-id-type="pmid">29224926</pub-id></citation></ref>
<ref id="B12">
<label>12.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Asaoka</surname> <given-names>R</given-names></name> <name><surname>Murata</surname> <given-names>H</given-names></name> <name><surname>Hirasawa</surname> <given-names>K</given-names></name> <name><surname>Fujino</surname> <given-names>Y</given-names></name> <name><surname>Matsuura</surname> <given-names>M</given-names></name> <name><surname>Miki</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Using deep learning and transfer learning to accurately diagnose early-onset glaucoma from macular optical coherence tomography images</article-title>. <source>Am J Ophthalmol.</source> (<year>2019</year>) <volume>198</volume>:<fpage>136</fpage>&#x02013;<lpage>45</lpage>. <pub-id pub-id-type="doi">10.1016/j.ajo.2018.10.007</pub-id><pub-id pub-id-type="pmid">30316669</pub-id></citation></ref>
<ref id="B13">
<label>13.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shibata</surname> <given-names>N</given-names></name> <name><surname>Tanito</surname> <given-names>M</given-names></name> <name><surname>Mitsuhashi</surname> <given-names>K</given-names></name> <name><surname>Fujino</surname> <given-names>Y</given-names></name> <name><surname>Matsuura</surname> <given-names>M</given-names></name> <name><surname>Murata</surname> <given-names>H</given-names></name> <etal/></person-group>. <article-title>Development of a deep residual learning algorithm to screen for glaucoma from fundus photography</article-title>. <source>Sci Rep.</source> (<year>2018</year>) <volume>8</volume>:<fpage>14665</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-018-33013-w</pub-id><pub-id pub-id-type="pmid">30279554</pub-id></citation></ref>
<ref id="B14">
<label>14.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hood</surname> <given-names>DC</given-names></name> <name><surname>De Moraes</surname> <given-names>CG</given-names></name></person-group>. <article-title>Efficacy of a deep learning system for detecting glaucomatous optic neuropathy based on color fundus photographs</article-title>. <source>Ophthalmology.</source> (<year>2018</year>) <volume>125</volume>:<fpage>1207</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2018.04.020</pub-id><pub-id pub-id-type="pmid">30032794</pub-id></citation></ref>
<ref id="B15">
<label>15.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Asaoka</surname> <given-names>R</given-names></name> <name><surname>Murata</surname> <given-names>H</given-names></name> <name><surname>Iwase</surname> <given-names>A</given-names></name> <name><surname>Araie</surname> <given-names>M</given-names></name></person-group>. <article-title>Detecting preperimetric glaucoma with standard automated perimetry using a deep learning classifier</article-title>. <source>Ophthalmology.</source> (<year>2016</year>) <volume>123</volume>:<fpage>1974</fpage>&#x02013;<lpage>80</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2016.05.029</pub-id><pub-id pub-id-type="pmid">27395766</pub-id></citation></ref>
<ref id="B16">
<label>16.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Varadarajan</surname> <given-names>AV</given-names></name> <name><surname>Poplin</surname> <given-names>R</given-names></name> <name><surname>Blumer</surname> <given-names>K</given-names></name> <name><surname>Angermueller</surname> <given-names>C</given-names></name> <name><surname>Ledsam</surname> <given-names>J</given-names></name> <name><surname>Chopra</surname> <given-names>R</given-names></name> <etal/></person-group>. <article-title>Deep learning for predicting refractive error from retinal fundus images</article-title>. <source>Invest Ophthalmol Vis Sci.</source> (<year>2018</year>) <volume>59</volume>:<fpage>2861</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1167/iovs.18-23887</pub-id><pub-id pub-id-type="pmid">30025129</pub-id></citation></ref>
<ref id="B17">
<label>17.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>Y</given-names></name> <name><surname>Li</surname> <given-names>R</given-names></name> <name><surname>Lin</surname> <given-names>D</given-names></name> <name><surname>Zhang</surname> <given-names>X</given-names></name> <name><surname>Li</surname> <given-names>W</given-names></name> <name><surname>Wang</surname> <given-names>J</given-names></name> <etal/></person-group>. <article-title>Automatic identification of myopia based on ocular appearance images using deep learning</article-title>. <source>Ann Transl Med.</source> (<year>2020</year>) <volume>8</volume>:<fpage>705</fpage>. <pub-id pub-id-type="doi">10.21037/atm.2019.12.39</pub-id><pub-id pub-id-type="pmid">32617325</pub-id></citation></ref>
<ref id="B18">
<label>18.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rampat</surname> <given-names>R</given-names></name> <name><surname>Deshmukh</surname> <given-names>R</given-names></name> <name><surname>Chen</surname> <given-names>X</given-names></name> <name><surname>Ting</surname> <given-names>DSW</given-names></name> <name><surname>Said</surname> <given-names>DG</given-names></name> <name><surname>Dua</surname> <given-names>HS</given-names></name> <etal/></person-group>. <article-title>Artificial intelligence in cornea, refractive surgery, and cataract: basic principles, clinical applications, and future directions</article-title>. <source>Asia Pac J Ophthalmol.</source> (<year>2021</year>) <volume>10</volume>:<fpage>268</fpage>&#x02013;<lpage>81</lpage>. <pub-id pub-id-type="doi">10.1097/apo.0000000000000394</pub-id><pub-id pub-id-type="pmid">34224467</pub-id></citation></ref>
<ref id="B19">
<label>19.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Baird</surname> <given-names>PN</given-names></name> <name><surname>Saw</surname> <given-names>SM</given-names></name> <name><surname>Lanca</surname> <given-names>C</given-names></name> <name><surname>Guggenheim</surname> <given-names>JA</given-names></name> <name><surname>Smith Iii</surname> <given-names>EL</given-names></name> <name><surname>Zhou</surname> <given-names>X</given-names></name> <etal/></person-group>. <article-title>Myopia</article-title>. <source>Nat Rev Dis Primers.</source> (<year>2020</year>) <volume>6</volume>:<fpage>99</fpage>. <pub-id pub-id-type="doi">10.1038/s41572-020-00231-4</pub-id><pub-id pub-id-type="pmid">33328468</pub-id></citation></ref>
<ref id="B20">
<label>20.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mccarthy</surname> <given-names>J</given-names></name> <name><surname>Minsky</surname> <given-names>M</given-names></name> <name><surname>Rochester</surname> <given-names>N</given-names></name> <name><surname>Shannon</surname> <given-names>CEJAM</given-names></name></person-group>. <article-title>A proposal for the Dartmouth summer research project on artificial intelligence, August 31, 1955</article-title>. <source>AI Magazine</source>. (<year>2006</year>) <volume>27</volume>:<fpage>12</fpage>&#x02013;<lpage>4</lpage>. <pub-id pub-id-type="doi">10.1609/aimag.v27i4.1904</pub-id></citation>
</ref>
<ref id="B21">
<label>21.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ajani</surname> <given-names>TS</given-names></name> <name><surname>Imoize</surname> <given-names>AL</given-names></name> <name><surname>Atayero</surname> <given-names>AA</given-names></name></person-group>. <article-title>An overview of machine learning within embedded and mobile devices-optimizations and applications</article-title>. <source>Sensors.</source> (<year>2021</year>) <volume>21</volume>:<fpage>412</fpage>. <pub-id pub-id-type="doi">10.3390/s21134412</pub-id><pub-id pub-id-type="pmid">34203119</pub-id></citation></ref>
<ref id="B22">
<label>22.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bengio</surname> <given-names>Y</given-names></name> <name><surname>Courville</surname> <given-names>A</given-names></name> <name><surname>Vincent</surname> <given-names>P</given-names></name></person-group>. <article-title>Representation learning: a review and new perspectives</article-title>. <source>IEEE Trans Pattern Anal Mach Intell.</source> (<year>2013</year>) <volume>35</volume>:<fpage>1798</fpage>&#x02013;<lpage>828</lpage>. <pub-id pub-id-type="doi">10.1109/tpami.2013.50</pub-id><pub-id pub-id-type="pmid">23787338</pub-id></citation></ref>
<ref id="B23">
<label>23.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>X</given-names></name> <name><surname>Liu</surname> <given-names>L</given-names></name> <name><surname>Zhao</surname> <given-names>L</given-names></name> <name><surname>Guo</surname> <given-names>C</given-names></name> <name><surname>Li</surname> <given-names>R</given-names></name> <name><surname>Wang</surname> <given-names>T</given-names></name> <etal/></person-group>. <article-title>Application of artificial intelligence in anterior segment ophthalmic diseases: diversity and standardization</article-title>. <source>Ann Transl Med.</source> (<year>2020</year>) <volume>8</volume>:<fpage>714</fpage>. <pub-id pub-id-type="doi">10.21037/atm-20-976</pub-id><pub-id pub-id-type="pmid">32617334</pub-id></citation></ref>
<ref id="B24">
<label>24.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>LeCun</surname> <given-names>Y</given-names></name> <name><surname>Bengio</surname> <given-names>Y</given-names></name> <name><surname>Hinton</surname> <given-names>G</given-names></name></person-group>. <article-title>Deep learning</article-title>. <source>Nature.</source> (<year>2015</year>) <volume>521</volume>:<fpage>436</fpage>&#x02013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1038/nature14539</pub-id><pub-id pub-id-type="pmid">26017442</pub-id></citation></ref>
<ref id="B25">
<label>25.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hekler</surname> <given-names>A</given-names></name> <name><surname>Utikal</surname> <given-names>JS</given-names></name> <name><surname>Enk</surname> <given-names>AH</given-names></name> <name><surname>Solass</surname> <given-names>W</given-names></name> <name><surname>Schmitt</surname> <given-names>M</given-names></name> <name><surname>Klode</surname> <given-names>J</given-names></name> <etal/></person-group>. <article-title>Deep learning outperformed 11 pathologists in the classification of histopathological melanoma images</article-title>. <source>Eur J Cancer.</source> (<year>2019</year>) <volume>118</volume>:<fpage>91</fpage>&#x02013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1016/j.ejca.2019.06.012</pub-id><pub-id pub-id-type="pmid">32008919</pub-id></citation></ref>
<ref id="B26">
<label>26.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maron</surname> <given-names>RC</given-names></name> <name><surname>Weichenthal</surname> <given-names>M</given-names></name> <name><surname>Utikal</surname> <given-names>JS</given-names></name> <name><surname>Hekler</surname> <given-names>A</given-names></name> <name><surname>Berking</surname> <given-names>C</given-names></name> <name><surname>Hauschild</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Systematic outperformance of 112 dermatologists in multiclass skin cancer image classification by convolutional neural networks</article-title>. <source>Eur J Cancer.</source> (<year>2019</year>) <volume>119</volume>:<fpage>57</fpage>&#x02013;<lpage>65</lpage>. <pub-id pub-id-type="doi">10.1016/j.ejca.2019.06.013</pub-id><pub-id pub-id-type="pmid">31419752</pub-id></citation></ref>
<ref id="B27">
<label>27.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dascalu</surname> <given-names>A</given-names></name> <name><surname>David</surname> <given-names>EO</given-names></name></person-group>. <article-title>Skin cancer detection by deep learning and sound analysis algorithms: a prospective clinical study of an elementary dermoscope</article-title>. <source>EBioMedicine.</source> (<year>2019</year>) <volume>43</volume>:<fpage>107</fpage>&#x02013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1016/j.ebiom.2019.04.055</pub-id><pub-id pub-id-type="pmid">31101596</pub-id></citation></ref>
<ref id="B28">
<label>28.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Al-Antari</surname> <given-names>MA</given-names></name> <name><surname>Al-Masni</surname> <given-names>MA</given-names></name> <name><surname>Kim</surname> <given-names>TS</given-names></name></person-group>. <article-title>Deep learning computer-aided diagnosis for breast lesion in digital mammogram</article-title>. <source>Adv Exp Med Biol.</source> (<year>2020</year>) <volume>1213</volume>:<fpage>59</fpage>&#x02013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-030-33128-3_4</pub-id><pub-id pub-id-type="pmid">32030663</pub-id></citation></ref>
<ref id="B29">
<label>29.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Deo</surname> <given-names>RC</given-names></name></person-group>. <article-title>Machine learning in medicine</article-title>. <source>Circulation.</source> (<year>2015</year>) <volume>132</volume>:<fpage>1920</fpage>&#x02013;<lpage>30</lpage>. <pub-id pub-id-type="doi">10.1161/circulationaha.115.001593</pub-id><pub-id pub-id-type="pmid">26572668</pub-id></citation></ref>
<ref id="B30">
<label>30.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Morgan</surname> <given-names>IG</given-names></name> <name><surname>French</surname> <given-names>AN</given-names></name> <name><surname>Ashby</surname> <given-names>RS</given-names></name> <name><surname>Guo</surname> <given-names>X</given-names></name> <name><surname>Ding</surname> <given-names>X</given-names></name> <name><surname>He</surname> <given-names>M</given-names></name> <etal/></person-group>. <article-title>The epidemics of myopia: Aetiology and prevention</article-title>. <source>Prog Retin Eye Res.</source> (<year>2018</year>) <volume>62</volume>:<fpage>134</fpage>&#x02013;<lpage>49</lpage>. <pub-id pub-id-type="doi">10.1016/j.preteyeres.2017.09.004</pub-id><pub-id pub-id-type="pmid">28951126</pub-id></citation></ref>
<ref id="B31">
<label>31.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holden</surname> <given-names>BA</given-names></name> <name><surname>Fricke</surname> <given-names>TR</given-names></name> <name><surname>Wilson</surname> <given-names>DA</given-names></name> <name><surname>Jong</surname> <given-names>M</given-names></name> <name><surname>Naidoo</surname> <given-names>KS</given-names></name> <name><surname>Sankaridurg</surname> <given-names>P</given-names></name> <etal/></person-group>. <article-title>Global prevalence of myopia and high myopia and temporal trends from 2000 through 2050</article-title>. <source>Ophthalmology.</source> (<year>2016</year>) <volume>123</volume>:<fpage>1036</fpage>&#x02013;<lpage>42</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2016.01.006</pub-id><pub-id pub-id-type="pmid">26875007</pub-id></citation></ref>
<ref id="B32">
<label>32.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wong</surname> <given-names>CW</given-names></name> <name><surname>Tsai</surname> <given-names>A</given-names></name> <name><surname>Jonas</surname> <given-names>JB</given-names></name> <name><surname>Ohno-Matsui</surname> <given-names>K</given-names></name> <name><surname>Chen</surname> <given-names>J</given-names></name> <name><surname>Ang</surname> <given-names>M</given-names></name> <etal/></person-group>. <article-title>Digital screen time during the COVID-19 pandemic: risk for a further myopia boom?</article-title> <source>Am J Ophthalmol.</source> (<year>2021</year>) <volume>223</volume>:<fpage>333</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1016/j.ajo.2020.07.034</pub-id><pub-id pub-id-type="pmid">32738229</pub-id></citation></ref>
<ref id="B33">
<label>33.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>J</given-names></name> <name><surname>Li</surname> <given-names>Y</given-names></name> <name><surname>Musch</surname> <given-names>DC</given-names></name> <name><surname>Wei</surname> <given-names>N</given-names></name> <name><surname>Qi</surname> <given-names>X</given-names></name> <name><surname>Ding</surname> <given-names>G</given-names></name> <etal/></person-group>. <article-title>Progression of myopia in school-aged children after COVID-19 home confinement</article-title>. <source>JAMA Ophthalmol.</source> (<year>2021</year>) <volume>139</volume>:<fpage>293</fpage>&#x02013;<lpage>300</lpage>. <pub-id pub-id-type="doi">10.1001/jamaophthalmol.2020.6239</pub-id><pub-id pub-id-type="pmid">33443542</pub-id></citation></ref>
<ref id="B34">
<label>34.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amirsolaimani</surname> <given-names>B</given-names></name> <name><surname>Peyman</surname> <given-names>G</given-names></name> <name><surname>Schwiegerling</surname> <given-names>J</given-names></name> <name><surname>Bablumyan</surname> <given-names>A</given-names></name> <name><surname>Peyghambarian</surname> <given-names>N</given-names></name></person-group>. <article-title>A new low-cost, compact, auto-phoropter for refractive assessment in developing countries</article-title>. <source>Sci Rep.</source> (<year>2017</year>) <volume>7</volume>:<fpage>13990</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-017-14507-5</pub-id><pub-id pub-id-type="pmid">29070904</pub-id></citation></ref>
<ref id="B35">
<label>35.</label>
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Tan</surname> <given-names>T-E</given-names></name> <name><surname>Ting</surname> <given-names>DS</given-names></name> <name><surname>Liu</surname> <given-names>Y</given-names></name> <name><surname>Li</surname> <given-names>S</given-names></name> <name><surname>Chen</surname> <given-names>C</given-names></name> <name><surname>Nguyen</surname> <given-names>Q</given-names></name> <etal/></person-group>. <article-title>Artificial intelligence using a deep learning system with transfer learning to predict refractive error and myopic macular degeneration from color fundus photographs</article-title>. <source>Investig Ophthalmol Vis Sci.</source> (<year>2019</year>) <volume>60</volume>:<fpage>1478</fpage>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://iovs.arvojournals.org/article.aspx?articleid=2745932">https://iovs.arvojournals.org/article.aspx?articleid=2745932</ext-link></citation>
</ref>
<ref id="B36">
<label>36.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>JO</given-names></name> <name><surname>Liu</surname> <given-names>H</given-names></name> <name><surname>Ting</surname> <given-names>DSJ</given-names></name> <name><surname>Jeon</surname> <given-names>S</given-names></name> <name><surname>Chan</surname> <given-names>RVP</given-names></name> <name><surname>Kim</surname> <given-names>JE</given-names></name> <etal/></person-group>. <article-title>Digital technology, tele-medicine and artificial intelligence in ophthalmology: a global perspective</article-title>. <source>Prog Retin Eye Res.</source> (<year>2021</year>) <volume>82</volume>:<fpage>100900</fpage>. <pub-id pub-id-type="doi">10.1016/j.preteyeres.2020.100900</pub-id><pub-id pub-id-type="pmid">32898686</pub-id></citation></ref>
<ref id="B37">
<label>37.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tan</surname> <given-names>NM</given-names></name> <name><surname>Liu</surname> <given-names>J</given-names></name> <name><surname>Wong</surname> <given-names>DK</given-names></name> <name><surname>Lim</surname> <given-names>JH</given-names></name> <name><surname>Zhang</surname> <given-names>Z</given-names></name> <name><surname>Lu</surname> <given-names>S</given-names></name> <etal/></person-group>. <article-title>Automatic detection of pathological myopia using variational level set</article-title>. <source>Annu Int Conf IEEE Eng Med Biol Soc.</source> (<year>2009</year>) <volume>2009</volume>:<fpage>3609</fpage>&#x02013;<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1109/iembs.2009.5333517</pub-id><pub-id pub-id-type="pmid">19964081</pub-id></citation></ref>
<ref id="B38">
<label>38.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Z</given-names></name> <name><surname>Xu</surname> <given-names>Y</given-names></name> <name><surname>Liu</surname> <given-names>J</given-names></name> <name><surname>Wong</surname> <given-names>DW</given-names></name> <name><surname>Kwoh</surname> <given-names>CK</given-names></name> <name><surname>Saw</surname> <given-names>SM</given-names></name> <etal/></person-group>. <article-title>Automatic diagnosis of pathological myopia from heterogeneous biomedical data</article-title>. <source>PLoS ONE.</source> (<year>2013</year>) <volume>8</volume>:<fpage>e65736</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0065736</pub-id><pub-id pub-id-type="pmid">23799040</pub-id></citation></ref>
<ref id="B39">
<label>39.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hemelings</surname> <given-names>R</given-names></name> <name><surname>Elen</surname> <given-names>B</given-names></name> <name><surname>Blaschko</surname> <given-names>MB</given-names></name> <name><surname>Jacob</surname> <given-names>J</given-names></name> <name><surname>Stalmans</surname> <given-names>I</given-names></name> <name><surname>De Boever</surname> <given-names>P</given-names></name></person-group>. <article-title>Pathological myopia classification with simultaneous lesion segmentation using deep learning</article-title>. <source>Comput Methods Programs Biomed.</source> (<year>2021</year>) <volume>199</volume>:<fpage>105920</fpage>. <pub-id pub-id-type="doi">10.1016/j.cmpb.2020.105920</pub-id><pub-id pub-id-type="pmid">33412285</pub-id></citation></ref>
<ref id="B40">
<label>40.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Du</surname> <given-names>R</given-names></name> <name><surname>Xie</surname> <given-names>S</given-names></name> <name><surname>Fang</surname> <given-names>Y</given-names></name> <name><surname>Igarashi-Yokoi</surname> <given-names>T</given-names></name> <name><surname>Moriyama</surname> <given-names>M</given-names></name> <name><surname>Ogata</surname> <given-names>S</given-names></name> <etal/></person-group>. <article-title>Deep learning approach for automated detection of myopic maculopathy and pathologic myopia in fundus images</article-title>. <source>Ophthalmol Retina.</source> (<year>2021</year>) <volume>5</volume>:<fpage>1235</fpage>&#x02013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1016/j.oret.2021.02.006</pub-id><pub-id pub-id-type="pmid">33610832</pub-id></citation></ref>
<ref id="B41">
<label>41.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tan</surname> <given-names>TE</given-names></name> <name><surname>Anees</surname> <given-names>A</given-names></name> <name><surname>Chen</surname> <given-names>C</given-names></name> <name><surname>Li</surname> <given-names>S</given-names></name> <name><surname>Xu</surname> <given-names>X</given-names></name> <name><surname>Li</surname> <given-names>Z</given-names></name> <etal/></person-group>. <article-title>Retinal photograph-based deep learning algorithms for myopia and a blockchain platform to facilitate artificial intelligence medical research: a retrospective multicohort study</article-title>. <source>Lancet Digit Health.</source> (<year>2021</year>) <volume>3</volume>:<fpage>e317</fpage>&#x02013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1016/s2589-7500(21)00055-8</pub-id><pub-id pub-id-type="pmid">33890579</pub-id></citation></ref>
<ref id="B42">
<label>42.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>Y</given-names></name> <name><surname>Feng</surname> <given-names>W</given-names></name> <name><surname>Zhao</surname> <given-names>X</given-names></name> <name><surname>Liu</surname> <given-names>B</given-names></name> <name><surname>Zhang</surname> <given-names>Y</given-names></name> <name><surname>Chi</surname> <given-names>W</given-names></name> <etal/></person-group>. <article-title>Development and validation of a deep learning system to screen vision-threatening conditions in high myopia using optical coherence tomography images</article-title>. <source>Br J Ophthalmol.</source> (<year>2020</year>). <pub-id pub-id-type="doi">10.1136/bjophthalmol-2020-317825</pub-id><pub-id pub-id-type="pmid">33355150</pub-id></citation></ref>
<ref id="B43">
<label>43.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sogawa</surname> <given-names>T</given-names></name> <name><surname>Tabuchi</surname> <given-names>H</given-names></name> <name><surname>Nagasato</surname> <given-names>D</given-names></name> <name><surname>Masumoto</surname> <given-names>H</given-names></name> <name><surname>Ikuno</surname> <given-names>Y</given-names></name> <name><surname>Ohsugi</surname> <given-names>H</given-names></name> <etal/></person-group>. <article-title>Accuracy of a deep convolutional neural network in the detection of myopic macular diseases using swept-source optical coherence tomography</article-title>. <source>PLoS ONE.</source> (<year>2020</year>) <volume>15</volume>:<fpage>e0227240</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0227240</pub-id><pub-id pub-id-type="pmid">32298265</pub-id></citation></ref>
<ref id="B44">
<label>44.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Foo</surname> <given-names>LL</given-names></name> <name><surname>Ang</surname> <given-names>M</given-names></name> <name><surname>Wong</surname> <given-names>CW</given-names></name> <name><surname>Ohno-Matsui</surname> <given-names>K</given-names></name> <name><surname>Saw</surname> <given-names>SM</given-names></name> <name><surname>Wong</surname> <given-names>TY</given-names></name> <etal/></person-group>. <article-title>Is artificial intelligence a solution to the myopia pandemic?</article-title> <source>Br J Ophthalmol.</source> (<year>2021</year>) <volume>105</volume>:<fpage>741</fpage>&#x02013;<lpage>4</lpage>. <pub-id pub-id-type="doi">10.1136/bjophthalmol-2021-319129</pub-id><pub-id pub-id-type="pmid">33712483</pub-id></citation></ref>
<ref id="B45">
<label>45.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lin</surname> <given-names>H</given-names></name> <name><surname>Long</surname> <given-names>E</given-names></name> <name><surname>Ding</surname> <given-names>X</given-names></name> <name><surname>Diao</surname> <given-names>H</given-names></name> <name><surname>Chen</surname> <given-names>Z</given-names></name> <name><surname>Liu</surname> <given-names>R</given-names></name> <etal/></person-group>. <article-title>Prediction of myopia development among Chinese school-aged children using refraction data from electronic medical records: a retrospective, multicentre machine learning study</article-title>. <source>PLoS Med.</source> (<year>2018</year>) <volume>15</volume>:<fpage>e1002674</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pmed.1002674</pub-id><pub-id pub-id-type="pmid">30399150</pub-id></citation></ref>
<ref id="B46">
<label>46.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>X</given-names></name> <name><surname>Chen</surname> <given-names>G</given-names></name> <name><surname>Qian</surname> <given-names>Y</given-names></name> <name><surname>Wang</surname> <given-names>Y</given-names></name> <name><surname>Zhai</surname> <given-names>Y</given-names></name> <name><surname>Fan</surname> <given-names>D</given-names></name> <etal/></person-group>. <article-title>Prediction of myopia in adolescents through machine learning methods</article-title>. <source>Int J Environ Res Public Health.</source> (<year>2020</year>) <volume>17</volume>:<fpage>463</fpage>. <pub-id pub-id-type="doi">10.3390/ijerph17020463</pub-id><pub-id pub-id-type="pmid">31936770</pub-id></citation></ref>
<ref id="B47">
<label>47.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Seiler</surname> <given-names>T</given-names></name> <name><surname>Koufala</surname> <given-names>K</given-names></name> <name><surname>Richter</surname> <given-names>G</given-names></name></person-group>. <article-title>Iatrogenic keratectasia after laser <italic>in situ</italic> keratomileusis</article-title>. <source>J Refract Surg.</source> (<year>1998</year>) <volume>14</volume>:<fpage>312</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.3928/1081-597X-19980501-15</pub-id><pub-id pub-id-type="pmid">9641422</pub-id></citation></ref>
<ref id="B48">
<label>48.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ambr&#x000F3;sio</surname> <given-names>R</given-names> <suffix>Jr</suffix></name></person-group>. <article-title>Post-LASIK Ectasia: twenty years of a conundrum</article-title>. <source>Semin Ophthalmol</source>. (<year>2019</year>) <volume>34</volume>:<fpage>66</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1080/08820538.2019.1569075</pub-id><pub-id pub-id-type="pmid">30664391</pub-id></citation></ref>
<ref id="B49">
<label>49.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xie</surname> <given-names>Y</given-names></name> <name><surname>Zhao</surname> <given-names>L</given-names></name> <name><surname>Yang</surname> <given-names>X</given-names></name> <name><surname>Wu</surname> <given-names>X</given-names></name> <name><surname>Yang</surname> <given-names>Y</given-names></name> <name><surname>Huang</surname> <given-names>X</given-names></name> <etal/></person-group>. <article-title>Screening candidates for refractive surgery with corneal tomographic-based deep learning</article-title>. <source>JAMA Ophthalmol.</source> (<year>2020</year>) <volume>138</volume>:<fpage>519</fpage>&#x02013;<lpage>26</lpage>. <pub-id pub-id-type="doi">10.1001/jamaophthalmol.2020.0507</pub-id><pub-id pub-id-type="pmid">32215587</pub-id></citation></ref>
<ref id="B50">
<label>50.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yoo</surname> <given-names>TK</given-names></name> <name><surname>Ryu</surname> <given-names>IH</given-names></name> <name><surname>Choi</surname> <given-names>H</given-names></name> <name><surname>Kim</surname> <given-names>JK</given-names></name> <name><surname>Lee</surname> <given-names>IS</given-names></name> <name><surname>Kim</surname> <given-names>JS</given-names></name> <etal/></person-group>. <article-title>Explainable machine learning approach as a tool to understand factors used to select the refractive surgery technique on the expert level</article-title>. <source>Transl Vis Sci Technol.</source> (<year>2020</year>) <volume>9</volume>:<fpage>8</fpage>. <pub-id pub-id-type="doi">10.1167/tvst.9.2.8</pub-id><pub-id pub-id-type="pmid">32704414</pub-id></citation></ref>
<ref id="B51">
<label>51.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cui</surname> <given-names>T</given-names></name> <name><surname>Wang</surname> <given-names>Y</given-names></name> <name><surname>Ji</surname> <given-names>S</given-names></name> <name><surname>Li</surname> <given-names>Y</given-names></name> <name><surname>Hao</surname> <given-names>W</given-names></name> <name><surname>Zou</surname> <given-names>H</given-names></name> <etal/></person-group>. <article-title>Applying machine learning techniques in nomogram prediction and analysis for SMILE treatment</article-title>. <source>Am J Ophthalmol.</source> (<year>2020</year>) <volume>210</volume>:<fpage>71</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1016/j.ajo.2019.10.015</pub-id><pub-id pub-id-type="pmid">31647929</pub-id></citation></ref>
<ref id="B52">
<label>52.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guerra</surname> <given-names>MG</given-names></name> <name><surname>Silva</surname> <given-names>AMM</given-names></name> <name><surname>Marques</surname> <given-names>SHM</given-names></name> <name><surname>Melo</surname> <given-names>SH</given-names></name> <name><surname>P&#x000F3;voa</surname> <given-names>JA</given-names></name> <name><surname>Lobo</surname> <given-names>C</given-names></name> <etal/></person-group>. <article-title>Phakic intraocular lens implantation: refractive outcome and safety in patients with anterior chamber depth between 2.8 and 3.0 versus &#x02265;3.0 mm</article-title>. <source>Ophthalmic Res.</source> (<year>2017</year>) <volume>57</volume>:<fpage>239</fpage>&#x02013;<lpage>46</lpage>. <pub-id pub-id-type="doi">10.1159/000453528</pub-id><pub-id pub-id-type="pmid">28199995</pub-id></citation></ref>
<ref id="B53">
<label>53.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>D</given-names></name> <name><surname>Ho</surname> <given-names>Y</given-names></name> <name><surname>Sasa</surname> <given-names>Y</given-names></name> <name><surname>Lee</surname> <given-names>J</given-names></name> <name><surname>Yen</surname> <given-names>CC</given-names></name> <name><surname>Tan</surname> <given-names>C</given-names></name></person-group>. <article-title>Machine learning-guided prediction of central anterior chamber depth using slit lamp images from a portable smartphone device</article-title>. <source>Biosensors.</source> (<year>2021</year>) <volume>11</volume>:<fpage>182</fpage>. <pub-id pub-id-type="doi">10.3390/bios11060182</pub-id><pub-id pub-id-type="pmid">34198935</pub-id></citation></ref>
<ref id="B54">
<label>54.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xia</surname> <given-names>T</given-names></name> <name><surname>Martinez</surname> <given-names>CE</given-names></name> <name><surname>Tsai</surname> <given-names>LM</given-names></name></person-group>. <article-title>Update on intraocular lens formulas and calculations</article-title>. <source>Asia Pac J Ophthalmol.</source> (<year>2020</year>) <volume>9</volume>:<fpage>186</fpage>&#x02013;<lpage>93</lpage>. <pub-id pub-id-type="doi">10.1097/apo.0000000000000293</pub-id><pub-id pub-id-type="pmid">32501896</pub-id></citation></ref>
<ref id="B55">
<label>55.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Melles</surname> <given-names>RB</given-names></name> <name><surname>Holladay</surname> <given-names>JT</given-names></name> <name><surname>Chang</surname> <given-names>WJ</given-names></name></person-group>. <article-title>Accuracy of intraocular lens calculation formulas</article-title>. <source>Ophthalmology.</source> (<year>2018</year>) <volume>125</volume>:<fpage>169</fpage>&#x02013;<lpage>78</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2017.08.027</pub-id><pub-id pub-id-type="pmid">28951074</pub-id></citation></ref>
<ref id="B56">
<label>56.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sramka</surname> <given-names>M</given-names></name> <name><surname>Slovak</surname> <given-names>M</given-names></name> <name><surname>Tuckova</surname> <given-names>J</given-names></name> <name><surname>Stodulka</surname> <given-names>P</given-names></name></person-group>. <article-title>Improving clinical refractive results of cataract surgery by machine learning</article-title>. <source>PeerJ.</source> (<year>2019</year>) <volume>7</volume>:<fpage>e7202</fpage>. <pub-id pub-id-type="doi">10.7717/peerj.7202</pub-id><pub-id pub-id-type="pmid">31304064</pub-id></citation></ref>
<ref id="B57">
<label>57.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wan</surname> <given-names>KH</given-names></name> <name><surname>Lam</surname> <given-names>TCH</given-names></name> <name><surname>Yu</surname> <given-names>MCY</given-names></name> <name><surname>Chan</surname> <given-names>TCY</given-names></name></person-group>. <article-title>Accuracy and precision of intraocular lens calculations using the new Hill-RBF version 2.0 in eyes with high axial myopia</article-title>. <source>Am J Ophthalmol.</source> (<year>2019</year>) <volume>205</volume>:<fpage>66</fpage>&#x02013;<lpage>73</lpage>. <pub-id pub-id-type="doi">10.1016/j.ajo.2019.04.019</pub-id><pub-id pub-id-type="pmid">31078534</pub-id></citation></ref>
<ref id="B58">
<label>58.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Savini</surname> <given-names>G</given-names></name> <name><surname>Di Maita</surname> <given-names>M</given-names></name> <name><surname>Hoffer</surname> <given-names>KJ</given-names></name> <name><surname>N&#x000E6;ser</surname> <given-names>K</given-names></name> <name><surname>Schiano-Lomoriello</surname> <given-names>D</given-names></name> <name><surname>Vagge</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Comparison of 13 formulas for IOL power calculation with measurements from partial coherence interferometry</article-title>. <source>Br J Ophthalmol.</source> (<year>2021</year>) <volume>105</volume>:<fpage>484</fpage>&#x02013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1136/bjophthalmol-2020-316193</pub-id><pub-id pub-id-type="pmid">32522789</pub-id></citation></ref>
<ref id="B59">
<label>59.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kane</surname> <given-names>JX</given-names></name> <name><surname>Van Heerden</surname> <given-names>A</given-names></name> <name><surname>Atik</surname> <given-names>A</given-names></name> <name><surname>Petsoglou</surname> <given-names>C</given-names></name></person-group>. <article-title>Accuracy of 3 new methods for intraocular lens power selection</article-title>. <source>J Cataract Refract Surg.</source> (<year>2017</year>) <volume>43</volume>:<fpage>333</fpage>&#x02013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1016/j.jcrs.2016.12.021</pub-id><pub-id pub-id-type="pmid">28410714</pub-id></citation></ref>
<ref id="B60">
<label>60.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Darcy</surname> <given-names>K</given-names></name> <name><surname>Gunn</surname> <given-names>D</given-names></name> <name><surname>Tavassoli</surname> <given-names>S</given-names></name> <name><surname>Sparrow</surname> <given-names>J</given-names></name> <name><surname>Kane</surname> <given-names>JX</given-names></name></person-group>. <article-title>Assessment of the accuracy of new and updated intraocular lens power calculation formulas in 10 930 eyes from the UK National Health Service</article-title>. <source>J Cataract Refract Surg.</source> (<year>2020</year>) <volume>46</volume>:<fpage>2</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1016/j.jcrs.2019.08.014</pub-id><pub-id pub-id-type="pmid">32050225</pub-id></citation></ref>
<ref id="B61">
<label>61.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Connell</surname> <given-names>BJ</given-names></name> <name><surname>Kane</surname> <given-names>JX</given-names></name></person-group>. <article-title>Comparison of the Kane formula with existing formulas for intraocular lens power selection</article-title>. <source>BMJ Open Ophthalmol.</source> (<year>2019</year>) <volume>4</volume>:<fpage>e000251</fpage>. <pub-id pub-id-type="doi">10.1136/bmjophth-2018-000251</pub-id><pub-id pub-id-type="pmid">31179396</pub-id></citation></ref>
<ref id="B62">
<label>62.</label>
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Mrochen</surname> <given-names>M</given-names></name> <name><surname>Zakharov</surname> <given-names>P</given-names></name> <name><surname>Tabakc&#x00131;</surname> <given-names>BN</given-names></name> <name><surname>Tanr&#x00131;verdi</surname> <given-names>C</given-names></name> <name><surname>K&#x00131;l&#x00131;&#x000E7;</surname> <given-names>A</given-names></name> <name><surname>Flitcroft</surname> <given-names>DI</given-names></name></person-group>. <article-title>Visual lifestyle of myopic children assessed with AI-powered wearable monitoring</article-title>. <source>Investig Ophthalmol Vis Sci.</source> (<year>2020</year>) <volume>61</volume>:<fpage>82</fpage>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://iovs.arvojournals.org/article.aspx?articleid=2766581">https://iovs.arvojournals.org/article.aspx?articleid=2766581</ext-link></citation>
</ref>
<ref id="B63">
<label>63.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rose</surname> <given-names>KA</given-names></name> <name><surname>Morgan</surname> <given-names>IG</given-names></name> <name><surname>Ip</surname> <given-names>J</given-names></name> <name><surname>Kifley</surname> <given-names>A</given-names></name> <name><surname>Huynh</surname> <given-names>S</given-names></name> <name><surname>Smith</surname> <given-names>W</given-names></name> <etal/></person-group>. <article-title>Outdoor activity reduces the prevalence of myopia in children</article-title>. <source>Ophthalmology.</source> (<year>2008</year>) <volume>115</volume>:<fpage>1279</fpage>&#x02013;<lpage>85</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2007.12.019</pub-id><pub-id pub-id-type="pmid">18294691</pub-id></citation></ref>
<ref id="B64">
<label>64.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sherwin</surname> <given-names>JC</given-names></name> <name><surname>Reacher</surname> <given-names>MH</given-names></name> <name><surname>Keogh</surname> <given-names>RH</given-names></name> <name><surname>Khawaja</surname> <given-names>AP</given-names></name> <name><surname>Mackey</surname> <given-names>DA</given-names></name> <name><surname>Foster</surname> <given-names>PJ</given-names></name></person-group>. <article-title>The association between time spent outdoors and myopia in children and adolescents: a systematic review and meta-analysis</article-title>. <source>Ophthalmology.</source> (<year>2012</year>) <volume>119</volume>:<fpage>2141</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1016/j.ophtha.2012.04.020</pub-id><pub-id pub-id-type="pmid">22809757</pub-id></citation></ref>
<ref id="B65">
<label>65.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Verkicharla</surname> <given-names>PK</given-names></name> <name><surname>Ramamurthy</surname> <given-names>D</given-names></name> <name><surname>Nguyen</surname> <given-names>QD</given-names></name> <name><surname>Zhang</surname> <given-names>X</given-names></name> <name><surname>Pu</surname> <given-names>SH</given-names></name> <name><surname>Malhotra</surname> <given-names>R</given-names></name> <etal/></person-group>. <article-title>Development of the FitSight fitness tracker to increase time outdoors to prevent myopia</article-title>. <source>Transl Vis Sci Technol.</source> (<year>2017</year>) <volume>6</volume>:<fpage>20</fpage>. <pub-id pub-id-type="doi">10.1167/tvst.6.3.20</pub-id><pub-id pub-id-type="pmid">28660095</pub-id></citation></ref>
<ref id="B66">
<label>66.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lin</surname> <given-names>Z</given-names></name> <name><surname>Vasudevan</surname> <given-names>B</given-names></name> <name><surname>Mao</surname> <given-names>GY</given-names></name> <name><surname>Ciuffreda</surname> <given-names>KJ</given-names></name> <name><surname>Jhanji</surname> <given-names>V</given-names></name> <name><surname>Li</surname> <given-names>XX</given-names></name> <etal/></person-group>. <article-title>The influence of near work on myopic refractive change in urban students in Beijing: a three-year follow-up report</article-title>. <source>Graefes Arch Clin Exp Ophthalmol.</source> (<year>2016</year>) <volume>254</volume>:<fpage>2247</fpage>&#x02013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1007/s00417-016-3440-9</pub-id><pub-id pub-id-type="pmid">27460281</pub-id></citation></ref>
<ref id="B67">
<label>67.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sun</surname> <given-names>JT</given-names></name> <name><surname>An</surname> <given-names>M</given-names></name> <name><surname>Yan</surname> <given-names>XB</given-names></name> <name><surname>Li</surname> <given-names>GH</given-names></name> <name><surname>Wang</surname> <given-names>DB</given-names></name></person-group>. <article-title>Prevalence and related factors for myopia in school-aged children in Qingdao</article-title>. <source>J Ophthalmol.</source> (<year>2018</year>) <volume>2018</volume>:<fpage>9781987</fpage>. <pub-id pub-id-type="doi">10.1155/2018/9781987</pub-id><pub-id pub-id-type="pmid">29507811</pub-id></citation></ref>
<ref id="B68">
<label>68.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cao</surname> <given-names>Y</given-names></name> <name><surname>Lan</surname> <given-names>W</given-names></name> <name><surname>Wen</surname> <given-names>L</given-names></name> <name><surname>Li</surname> <given-names>X</given-names></name> <name><surname>Pan</surname> <given-names>L</given-names></name> <name><surname>Wang</surname> <given-names>X</given-names></name> <etal/></person-group>. <article-title>An effectiveness study of a wearable device (Clouclip) intervention in unhealthy visual behaviors among school-age children: a pilot study</article-title>. <source>Medicine.</source> (<year>2020</year>) <volume>99</volume>:<fpage>e17992</fpage>. <pub-id pub-id-type="doi">10.1097/md.0000000000017992</pub-id><pub-id pub-id-type="pmid">31914011</pub-id></citation></ref>
<ref id="B69">
<label>69.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wen</surname> <given-names>L</given-names></name> <name><surname>Cheng</surname> <given-names>Q</given-names></name> <name><surname>Lan</surname> <given-names>W</given-names></name> <name><surname>Cao</surname> <given-names>Y</given-names></name> <name><surname>Li</surname> <given-names>X</given-names></name> <name><surname>Lu</surname> <given-names>Y</given-names></name> <etal/></person-group>. <article-title>An objective comparison of light intensity and near-visual tasks between rural and urban school children in China by a wearable device Clouclip</article-title>. <source>Transl Vis Sci Technol.</source> (<year>2019</year>) <volume>8</volume>:<fpage>15</fpage>. <pub-id pub-id-type="doi">10.1167/tvst.8.6.15</pub-id><pub-id pub-id-type="pmid">31772826</pub-id></citation></ref>
<ref id="B70">
<label>70.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Verhoeven</surname> <given-names>VJ</given-names></name> <name><surname>Hysi</surname> <given-names>PG</given-names></name> <name><surname>Wojciechowski</surname> <given-names>R</given-names></name> <name><surname>Fan</surname> <given-names>Q</given-names></name> <name><surname>Guggenheim</surname> <given-names>JA</given-names></name> <name><surname>H&#x000F6;hn</surname> <given-names>R</given-names></name> <etal/></person-group>. <article-title>Genome-wide meta-analyses of multiancestry cohorts identify multiple new susceptibility loci for refractive error and myopia</article-title>. <source>Nat Genet.</source> (<year>2013</year>) <volume>45</volume>:<fpage>314</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1038/ng.2554</pub-id><pub-id pub-id-type="pmid">23396134</pub-id></citation></ref>
<ref id="B71">
<label>71.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fan</surname> <given-names>Q</given-names></name> <name><surname>Barathi</surname> <given-names>VA</given-names></name> <name><surname>Cheng</surname> <given-names>CY</given-names></name> <name><surname>Zhou</surname> <given-names>X</given-names></name> <name><surname>Meguro</surname> <given-names>A</given-names></name> <name><surname>Nakata</surname> <given-names>I</given-names></name> <etal/></person-group>. <article-title>Genetic variants on chromosome 1q41 influence ocular axial length and high myopia</article-title>. <source>PLoS Genet.</source> (<year>2012</year>) <volume>8</volume>:<fpage>e1002753</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pgen.1002753</pub-id><pub-id pub-id-type="pmid">22685421</pub-id></citation></ref>
<ref id="B72">
<label>72.</label>
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>J</given-names></name> <name><surname>Zhang</surname> <given-names>Q</given-names></name></person-group>. <article-title>Insight into the molecular genetics of myopia</article-title>. <source>Mol Vis.</source> (<year>2017</year>) <volume>23</volume>:<fpage>1048</fpage>&#x02013;<lpage>80</lpage>. Available online at: <ext-link ext-link-type="uri" xlink:href="http://www.molvis.org/molvis/v23/1048/">http://www.molvis.org/molvis/v23/1048/</ext-link></citation>
</ref>
<ref id="B73">
<label>73.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cai</surname> <given-names>XB</given-names></name> <name><surname>Shen</surname> <given-names>SR</given-names></name> <name><surname>Chen</surname> <given-names>DF</given-names></name> <name><surname>Zhang</surname> <given-names>Q</given-names></name> <name><surname>Jin</surname> <given-names>ZB</given-names></name></person-group>. <article-title>An overview of myopia genetics</article-title>. <source>Exp Eye Res.</source> (<year>2019</year>) <volume>188</volume>:<fpage>107778</fpage>. <pub-id pub-id-type="doi">10.1016/j.exer.2019.107778</pub-id><pub-id pub-id-type="pmid">31472110</pub-id></citation></ref>
<ref id="B74">
<label>74.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Williams</surname> <given-names>AM</given-names></name> <name><surname>Liu</surname> <given-names>Y</given-names></name> <name><surname>Regner</surname> <given-names>KR</given-names></name> <name><surname>Jotterand</surname> <given-names>F</given-names></name> <name><surname>Liu</surname> <given-names>P</given-names></name> <name><surname>Liang</surname> <given-names>M</given-names></name></person-group>. <article-title>Artificial intelligence, physiological genomics, and precision medicine</article-title>. <source>Physiol Genomics.</source> (<year>2018</year>) <volume>50</volume>:<fpage>237</fpage>&#x02013;<lpage>43</lpage>. <pub-id pub-id-type="doi">10.1152/physiolgenomics.00119.2017</pub-id><pub-id pub-id-type="pmid">29373082</pub-id></citation></ref>
<ref id="B75">
<label>75.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xu</surname> <given-names>J</given-names></name> <name><surname>Yang</surname> <given-names>P</given-names></name> <name><surname>Xue</surname> <given-names>S</given-names></name> <name><surname>Sharma</surname> <given-names>B</given-names></name> <name><surname>Sanchez-Martin</surname> <given-names>M</given-names></name> <name><surname>Wang</surname> <given-names>F</given-names></name> <etal/></person-group>. <article-title>Translating cancer genomics into precision medicine with artificial intelligence: applications, challenges and future perspectives</article-title>. <source>Hum Genet.</source> (<year>2019</year>) <volume>138</volume>:<fpage>109</fpage>&#x02013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1007/s00439-019-01970-5</pub-id><pub-id pub-id-type="pmid">30671672</pub-id></citation></ref>
<ref id="B76">
<label>76.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dias</surname> <given-names>R</given-names></name> <name><surname>Torkamani</surname> <given-names>A</given-names></name></person-group>. <article-title>Artificial intelligence in clinical and genomic diagnostics</article-title>. <source>Genome Med.</source> (<year>2019</year>) <volume>11</volume>:<fpage>70</fpage>. <pub-id pub-id-type="doi">10.1186/s13073-019-0689-8</pub-id><pub-id pub-id-type="pmid">31744524</pub-id></citation></ref>
<ref id="B77">
<label>77.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Foo</surname> <given-names>LL</given-names></name> <name><surname>Ng</surname> <given-names>WY</given-names></name> <name><surname>Lim</surname> <given-names>GYS</given-names></name> <name><surname>Tan</surname> <given-names>TE</given-names></name> <name><surname>Ang</surname> <given-names>M</given-names></name> <name><surname>Ting</surname> <given-names>DSW</given-names></name></person-group>. <article-title>Artificial intelligence in myopia: current and future trends</article-title>. <source>Curr Opin Ophthalmol.</source> (<year>2021</year>) <volume>32</volume>:<fpage>413</fpage>&#x02013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1097/icu.0000000000000791</pub-id><pub-id pub-id-type="pmid">34310401</pub-id></citation></ref>
<ref id="B78">
<label>78.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Waller</surname> <given-names>M</given-names></name> <name><surname>Stotler</surname> <given-names>C</given-names></name></person-group>. <article-title>Telemedicine: a primer</article-title>. <source>Curr Allergy Asthma Rep.</source> (<year>2018</year>) <volume>18</volume>:<fpage>54</fpage>. <pub-id pub-id-type="doi">10.1007/s11882-018-0808-4</pub-id><pub-id pub-id-type="pmid">30145709</pub-id></citation></ref>
<ref id="B79">
<label>79.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hollander</surname> <given-names>JE</given-names></name> <name><surname>Carr</surname> <given-names>BG</given-names></name></person-group>. <article-title>Virtually perfect? Telemedicine for Covid-19</article-title>. <source>N Engl J Med.</source> (<year>2020</year>) <volume>382</volume>:<fpage>1679</fpage>&#x02013;<lpage>81</lpage>. <pub-id pub-id-type="doi">10.1056/NEJMp2003539</pub-id><pub-id pub-id-type="pmid">32160451</pub-id></citation></ref>
<ref id="B80">
<label>80.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ye</surname> <given-names>Y</given-names></name> <name><surname>Wang</surname> <given-names>J</given-names></name> <name><surname>Xie</surname> <given-names>Y</given-names></name> <name><surname>Zhong</surname> <given-names>J</given-names></name> <name><surname>Hu</surname> <given-names>Y</given-names></name> <name><surname>Chen</surname> <given-names>B</given-names></name> <etal/></person-group>. <article-title>Global teleophthalmology with iPhones for real-time slitlamp eye examination</article-title>. <source>Eye Contact Lens.</source> (<year>2014</year>) <volume>40</volume>:<fpage>297</fpage>&#x02013;<lpage>300</lpage>. <pub-id pub-id-type="doi">10.1097/icl.0000000000000051</pub-id><pub-id pub-id-type="pmid">25083779</pub-id></citation></ref>
<ref id="B81">
<label>81.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ghazala</surname> <given-names>FR</given-names></name> <name><surname>Hamilton</surname> <given-names>R</given-names></name> <name><surname>Giardini</surname> <given-names>ME</given-names></name> <name><surname>Livingstone</surname> <given-names>IAT</given-names></name></person-group>. <article-title>Teleophthalmology techniques increase ophthalmic examination distance</article-title>. <source>Eye.</source> (<year>2021</year>) <volume>35</volume>:<fpage>1780</fpage>&#x02013;<lpage>1</lpage>. <pub-id pub-id-type="doi">10.1038/s41433-020-1085-8</pub-id><pub-id pub-id-type="pmid">32678348</pub-id></citation></ref>
<ref id="B82">
<label>82.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>X</given-names></name> <name><surname>Huang</surname> <given-names>Y</given-names></name> <name><surname>Liu</surname> <given-names>Z</given-names></name> <name><surname>Lai</surname> <given-names>W</given-names></name> <name><surname>Long</surname> <given-names>E</given-names></name> <name><surname>Zhang</surname> <given-names>K</given-names></name> <etal/></person-group>. <article-title>Universal artificial intelligence platform for collaborative management of cataracts</article-title>. <source>Br J Ophthalmol.</source> (<year>2019</year>) <volume>103</volume>:<fpage>1553</fpage>&#x02013;<lpage>60</lpage>. <pub-id pub-id-type="doi">10.1136/bjophthalmol-2019-314729</pub-id><pub-id pub-id-type="pmid">31481392</pub-id></citation></ref>
<ref id="B83">
<label>83.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Badano</surname> <given-names>A</given-names></name> <name><surname>Graff</surname> <given-names>CG</given-names></name> <name><surname>Badal</surname> <given-names>A</given-names></name> <name><surname>Sharma</surname> <given-names>D</given-names></name> <name><surname>Zeng</surname> <given-names>R</given-names></name> <name><surname>Samuelson</surname> <given-names>FW</given-names></name> <etal/></person-group>. <article-title>Evaluation of digital breast tomosynthesis as replacement of full-field digital mammography using an <italic>in silico</italic> imaging trial</article-title>. <source>JAMA Netw Open.</source> (<year>2018</year>) <volume>1</volume>:<fpage>e185474</fpage>. <pub-id pub-id-type="doi">10.1001/jamanetworkopen.2018.5474</pub-id><pub-id pub-id-type="pmid">30646401</pub-id></citation></ref>
<ref id="B84">
<label>84.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cha</surname> <given-names>KH</given-names></name> <name><surname>Petrick</surname> <given-names>N</given-names></name> <name><surname>Pezeshk</surname> <given-names>A</given-names></name> <name><surname>Graff</surname> <given-names>CG</given-names></name> <name><surname>Sharma</surname> <given-names>D</given-names></name> <name><surname>Badal</surname> <given-names>A</given-names></name> <etal/></person-group>. <article-title>Evaluation of data augmentation via synthetic images for improved breast mass detection on mammograms using deep learning</article-title>. <source>J Med Imaging.</source> (<year>2020</year>) <volume>7</volume>:<fpage>012703</fpage>. <pub-id pub-id-type="doi">10.1117/1.Jmi.7.1.012703</pub-id><pub-id pub-id-type="pmid">31763356</pub-id></citation></ref>
<ref id="B85">
<label>85.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maspero</surname> <given-names>M</given-names></name> <name><surname>Savenije</surname> <given-names>MHF</given-names></name> <name><surname>Dinkla</surname> <given-names>AM</given-names></name> <name><surname>Seevinck</surname> <given-names>PR</given-names></name> <name><surname>Intven</surname> <given-names>MPW</given-names></name> <name><surname>Jurgenliemk-Schulz</surname> <given-names>IM</given-names></name> <etal/></person-group>. <article-title>Dose evaluation of fast synthetic-CT generation using a generative adversarial network for general pelvis MR-only radiotherapy</article-title>. <source>Phys Med Biol.</source> (<year>2018</year>) <volume>63</volume>:<fpage>185001</fpage>. <pub-id pub-id-type="doi">10.1088/1361-6560/aada6d</pub-id><pub-id pub-id-type="pmid">30109989</pub-id></citation></ref>
<ref id="B86">
<label>86.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jin</surname> <given-names>CB</given-names></name> <name><surname>Kim</surname> <given-names>H</given-names></name> <name><surname>Liu</surname> <given-names>M</given-names></name> <name><surname>Jung</surname> <given-names>W</given-names></name> <name><surname>Joo</surname> <given-names>S</given-names></name> <name><surname>Park</surname> <given-names>E</given-names></name> <etal/></person-group>. <article-title>Deep CT to MR synthesis using paired and unpaired data</article-title>. <source>Sensors.</source> (<year>2019</year>) <volume>19</volume>:<fpage>2361</fpage>. <pub-id pub-id-type="doi">10.3390/s19102361</pub-id><pub-id pub-id-type="pmid">31121961</pub-id></citation></ref>
<ref id="B87">
<label>87.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>Q</given-names></name> <name><surname>Yan</surname> <given-names>P</given-names></name> <name><surname>Zhang</surname> <given-names>Y</given-names></name> <name><surname>Yu</surname> <given-names>H</given-names></name> <name><surname>Shi</surname> <given-names>Y</given-names></name> <name><surname>Mou</surname> <given-names>X</given-names></name> <etal/></person-group>. <article-title>Low-Dose CT image denoising using a generative adversarial network with Wasserstein distance and perceptual loss</article-title>. <source>IEEE Trans Med Imaging.</source> (<year>2018</year>) <volume>37</volume>:<fpage>1348</fpage>&#x02013;<lpage>57</lpage>. <pub-id pub-id-type="doi">10.1109/tmi.2018.2827462</pub-id><pub-id pub-id-type="pmid">29870364</pub-id></citation></ref>
<ref id="B88">
<label>88.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yi</surname> <given-names>X</given-names></name> <name><surname>Walia</surname> <given-names>E</given-names></name> <name><surname>Babyn</surname> <given-names>P</given-names></name></person-group>. <article-title>Generative adversarial network in medical imaging: a review</article-title>. <source>Med Image Anal.</source> (<year>2019</year>) <volume>58</volume>:<fpage>101552</fpage>. <pub-id pub-id-type="doi">10.1016/j.media.2019.101552</pub-id><pub-id pub-id-type="pmid">31521965</pub-id></citation></ref>
<ref id="B89">
<label>89.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>JG</given-names></name> <name><surname>Jun</surname> <given-names>S</given-names></name> <name><surname>Cho</surname> <given-names>YW</given-names></name> <name><surname>Lee</surname> <given-names>H</given-names></name> <name><surname>Kim</surname> <given-names>GB</given-names></name> <name><surname>Seo</surname> <given-names>JB</given-names></name> <etal/></person-group>. <article-title>Deep learning in medical imaging: general overview</article-title>. <source>Korean J Radiol.</source> (<year>2017</year>) <volume>18</volume>:<fpage>570</fpage>&#x02013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.3348/kjr.2017.18.4.570</pub-id><pub-id pub-id-type="pmid">28670152</pub-id></citation></ref>
<ref id="B90">
<label>90.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Phan</surname> <given-names>S</given-names></name> <name><surname>Satoh</surname> <given-names>S</given-names></name> <name><surname>Yoda</surname> <given-names>Y</given-names></name> <name><surname>Kashiwagi</surname> <given-names>K</given-names></name> <name><surname>Oshika</surname> <given-names>T</given-names></name></person-group>. <article-title>Evaluation of deep convolutional neural networks for glaucoma detection</article-title>. <source>Jpn J Ophthalmol.</source> (<year>2019</year>) <volume>63</volume>:<fpage>276</fpage>&#x02013;<lpage>83</lpage>. <pub-id pub-id-type="doi">10.1007/s10384-019-00659-6</pub-id><pub-id pub-id-type="pmid">30798379</pub-id></citation></ref>
<ref id="B91">
<label>91.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stead</surname> <given-names>WW</given-names></name></person-group>. <article-title>Clinical implications and challenges of artificial intelligence and deep learning</article-title>. <source>JAMA.</source> (<year>2018</year>) <volume>320</volume>:<fpage>1107</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1001/jama.2018.11029</pub-id><pub-id pub-id-type="pmid">30178025</pub-id></citation></ref>
<ref id="B92">
<label>92.</label>
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Zeiler</surname> <given-names>MD</given-names></name> <name><surname>Fergus</surname> <given-names>R</given-names></name></person-group>. <article-title>Visualizing and understanding convolutional networks</article-title>. <source>arXiv [Preprint]</source>. (<year>2013</year>). arXiv: 1311.2901v3. Available online at: <ext-link ext-link-type="uri" xlink:href="https://arxiv.org/pdf/1311.2901.pdf">https://arxiv.org/pdf/1311.2901.pdf</ext-link> (accessed November 28, 2013).</citation>
</ref>
<ref id="B93">
<label>93.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>TYA</given-names></name> <name><surname>Bressler</surname> <given-names>NM</given-names></name></person-group>. <article-title>Controversies in artificial intelligence</article-title>. <source>Curr Opin Ophthalmol.</source> (<year>2020</year>) <volume>31</volume>:<fpage>324</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1097/icu.0000000000000694</pub-id><pub-id pub-id-type="pmid">32769696</pub-id></citation></ref>
<ref id="B94">
<label>94.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shahbaz</surname> <given-names>R</given-names></name> <name><surname>Salducci</surname> <given-names>M</given-names></name></person-group>. <article-title>Law and order of modern ophthalmology: Teleophthalmology, smartphones legal and ethics</article-title>. <source>Eur J Ophthalmol.</source> (<year>2021</year>) <volume>31</volume>:<fpage>13</fpage>&#x02013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1177/1120672120934405</pub-id><pub-id pub-id-type="pmid">32544988</pub-id></citation></ref>
<ref id="B95">
<label>95.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sullivan</surname> <given-names>HR</given-names></name> <name><surname>Schweikart</surname> <given-names>SJ</given-names></name></person-group>. <article-title>Are current tort liability doctrines adequate for addressing injury caused by AI?</article-title> <source>AMA J Ethics.</source> (<year>2019</year>) <volume>21</volume>:<fpage>E160</fpage>. <pub-id pub-id-type="doi">10.1001/amajethics.2019.160</pub-id><pub-id pub-id-type="pmid">30794126</pub-id></citation></ref>
</ref-list>
<glossary>
<def-list>
<title>Abbreviations</title>
<def-item><term>AI</term>
<def><p>Artificial intelligence</p></def></def-item>
<def-item><term>OCT</term>
<def><p>Optical coherence tomography</p></def></def-item>
<def-item><term>ML</term>
<def><p>Machine learning</p></def></def-item>
<def-item><term>DL</term>
<def><p>Deep learning</p></def></def-item>
<def-item><term>ANNs</term>
<def><p>Artificial neural networks</p></def></def-item>
<def-item><term>CNNs</term>
<def><p>Convolutional neural networks</p></def></def-item>
<def-item><term>RNNs</term>
<def><p>Recurrent neural networks</p></def></def-item>
<def-item><term>ROC</term>
<def><p>Receiver operating characteristic curve</p></def></def-item>
<def-item><term>FPR</term>
<def><p>False positive rate</p></def></def-item>
<def-item><term>TPR</term>
<def><p>True positive rate</p></def></def-item>
<def-item><term>AUC</term>
<def><p>Area under the curve</p></def></def-item>
<def-item><term>5G</term>
<def><p>5th generation mobile communication technology</p></def></def-item>
<def-item><term>MMD</term>
<def><p>Myopic macular degeneration</p></def></def-item>
<def-item><term>PM</term>
<def><p>Pathologic myopia</p></def></def-item>
<def-item><term>LASEK</term>
<def><p>Laser epithelial keratomileusis</p></def></def-item>
<def-item><term>LASIK</term>
<def><p>Laser <italic>in situ</italic> keratomileusis</p></def></def-item>
<def-item><term>SMILE</term>
<def><p>Small incision lenticule extraction</p></def></def-item>
<def-item><term>PIOL</term>
<def><p>Phakic intraocular lens</p></def></def-item>
<def-item><term>IOL</term>
<def><p>Intraocular lens</p></def></def-item>
<def-item><term>AL</term>
<def><p>Axial length</p></def></def-item>
<def-item><term>ACD</term>
<def><p>Anterior chamber depth</p></def></def-item>
<def-item><term>BUII</term>
<def><p>Barrett Universal II</p></def></def-item>
<def-item><term>RBF</term>
<def><p>Radial basis function</p></def></def-item>
</def-list>
</glossary> 
</back>
</article> 