The spread of epidemics, especially COVID-19, is having a significant impact on the world. If an epidemic is not properly controlled at the beginning, it is likely to spread rapidly and widely through the coexistence of natural and social systems. A university community is a special, densely populated, micro self-organized social system, yet university authorities in such environments appear less cautious in defending against an epidemic. Currently, there is almost no quantitative research on epidemic spreading and response strategies in universities. In this paper, a case study of a university community is used to simulate how an infection evolves after an epidemic outbreak, based on a three-stage system dynamics method. The results show the following: improving the speed of the initial emergency response effectively controls the total number of patients; a quarantine policy helps to slow the evolution of the infection; and a higher isolation ratio incurs a higher cost, so the isolation ratio should be optimized. It is important to prepare emergency plans for controlling epidemic spreading and to carry out emergency drills and assessments regularly. Based on these results, we suggest an emergency management framework for public health events in university communities.
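As a rough illustration of the kind of compartmental, system-dynamics-style simulation described above, the sketch below integrates a simple SIQR-type model with a quarantine flow and an emergency-response delay. The compartments, parameter values, and the quarantine term are assumptions for illustration; the paper's actual three-stage stock-and-flow model is not reproduced here.

```python
# Minimal sketch of a compartmental (SIQR-style) simulation of the kind a
# system dynamics study of campus epidemic spreading might use. The
# compartments, parameter values, and the quarantine term are assumptions
# for illustration, not the paper's actual model.
def simulate_siqr(days=120, dt=0.1, n=20000,
                  beta=0.4, gamma=0.1, q_ratio=0.3, delay=5):
    s, i, q, r = n - 1.0, 1.0, 0.0, 0.0
    history = []
    for step in range(int(days / dt)):
        t = step * dt
        # quarantine only begins after the emergency-response delay
        q_rate = q_ratio if t >= delay else 0.0
        new_inf = beta * s * i / n          # susceptible -> infected
        new_quar = q_rate * i               # infected -> quarantined
        s -= new_inf * dt
        i += (new_inf - new_quar - gamma * i) * dt
        q += (new_quar - gamma * q) * dt
        r += gamma * (i + q) * dt           # recovery from both groups
        history.append((t, s, i, q, r))
    return history

# Example: a faster response (smaller delay) lowers the infection peak.
peak_slow = max(i for _, _, i, _, _ in simulate_siqr(delay=10))
peak_fast = max(i for _, _, i, _, _ in simulate_siqr(delay=2))
print(f"peak infected, slow response: {peak_slow:.0f}")
print(f"peak infected, fast response: {peak_fast:.0f}")
```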
I work as both a TEFL teacher and an SLA researcher in China. Recently, I have been working on new approaches to data analysis, and I have found that the book “Data Visualization and Analysis in Second Language Research” by Dr. Guilherme D. Garcia is of great significance for empirical research in the field of SLA. The book serves not only as a practical and user-friendly guide for beginners in SLA research, but also as a navigation aid for veteran SLA researchers seeking new perspectives in data analysis. The author makes a first attempt to connect data visualization in R to SLA research, reviewing previous research results and suggesting modifications implemented in R. From my perspective, the book fills a gap in SLA research practice by leading readers onto a new track of visualizing their reported results. Overall, in response to the increasingly high demands of SLA research, this book is specifically designed for quantitative data analysis in SLA. This review is intended to introduce readers to the realm of data visualization and to help SLA researchers better understand the potential value of R and visualization in SLA research.
This study aims to present a detailed knowledge map of teacher identity research using a 20-year data set from the Web of Science database. A bibliometric analysis was employed to analyze articles published between 2001 and 2021, showing the status of teacher identity research over the past 20 years, the research topics on teacher identity, and future research directions. Using the keyword “teacher identity” and filtering the data to articles and early-access papers in teaching and education, 848 articles were retrieved. Through production, content, and citation analysis with the help of a bibliometric tool, this study found that teacher identity remained a popular research theme in the academic field over the past 20 years, and its growing output involved many authors, institutions, sources, and countries. Furthermore, teachers’ “beliefs,” “emotions,” “professional development,” and “context,” which affect the construction and reconstruction of teacher identity, were popular topics in teacher identity research, and fundamental issues, including “identity,” “teacher identity,” “professional identity,” “development,” “teacher development,” “beliefs,” and “intersectionality,” remain promising topics for future research.
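For readers unfamiliar with the production and content analysis steps of such a bibliometric study, the sketch below counts author keywords and annual output from a hypothetical Web of Science CSV export. The column names and file name are assumptions; the study itself used a dedicated bibliometric tool rather than an ad hoc script like this.

```python
# Minimal sketch of a keyword-frequency and annual-production count of the kind
# used in bibliometric content analysis. The input format (a CSV export with
# "Author Keywords" and "Publication Year" columns) and the file name are
# assumptions for illustration.
import csv
from collections import Counter

def keyword_counts(path="wos_export.csv"):
    counts, by_year = Counter(), Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            by_year[row["Publication Year"]] += 1
            for kw in row["Author Keywords"].split(";"):
                kw = kw.strip().lower()
                if kw:
                    counts[kw] += 1
    return counts, by_year

counts, by_year = keyword_counts()
print(counts.most_common(10))   # e.g. "teacher identity", "professional identity", ...
print(sorted(by_year.items()))  # annual production trend
```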
In order to improve the translation quality of complex English sentences, this paper investigated the handling of unknown words. First, two baseline models, the recurrent neural machine translation (RNMT) model and the transformer model, were briefly introduced. Then, unknown words were identified and replaced based on WordNet and the semantic environment, and the result was input to the neural machine translation model for translation. Finally, experiments were conducted on several National Institute of Standards and Technology (NIST) datasets. It was found that the transformer model significantly outperformed the RNMT model: its average bilingual evaluation understudy (BLEU) value was 42.14, which was 6.96 higher than that of the RNMT model, and its translation error rate (TER) was also smaller. After combining with the intelligent algorithm, the BLEU values of both models improved and the TER became smaller; the average BLEU value of the transformer model combined with the intelligent algorithm was 43.7, and the average TER was 57.68. The experiments verify that the transformer model combined with the intelligent algorithm is reliable for translating complex sentences and can obtain higher-quality translation results.
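A minimal sketch of the unknown-word replacement idea is given below: out-of-vocabulary tokens are swapped for an in-vocabulary WordNet synonym before the sentence is passed to the translation model. The vocabulary, the frequency-based scoring, and the helper names are assumptions for illustration; the paper's semantic-environment scoring is not specified in the abstract.

```python
# Minimal sketch of unknown-word replacement: out-of-vocabulary tokens are
# replaced by an in-vocabulary WordNet synonym. The vocabulary, the scoring by
# corpus frequency, and the helper names are assumptions for illustration.
from nltk.corpus import wordnet  # requires: nltk.download("wordnet")

def replace_unknown(tokens, vocab, freq):
    out = []
    for tok in tokens:
        if tok in vocab:
            out.append(tok)
            continue
        # candidate synonyms from WordNet, restricted to known vocabulary
        cands = {lemma.name().replace("_", " ")
                 for syn in wordnet.synsets(tok)
                 for lemma in syn.lemmas()}
        known = [c for c in cands if c in vocab]
        # fall back to the original token if no in-vocabulary synonym exists
        out.append(max(known, key=lambda c: freq.get(c, 0)) if known else tok)
    return out

# Example: "automobile" is unknown, so the in-vocabulary synonym "car" is used.
vocab = {"he", "bought", "a", "new", "car"}
freq = {"car": 350}
print(replace_unknown("he bought a new automobile".split(), vocab, freq))
```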
Complex overburdens often distort reservoir images in terms of structural positioning, stratigraphic resolution, and amplitude fidelity. One prime example of a complex overburden is in the deepwater Gulf of Mexico, where thick and irregular layers of remobilized salt are situated above prospective reservoir intervals. The highly variable salt layers create large lateral velocity variations that distort wave propagation and the illumination of deeper reservoir targets. In subsalt imaging, tools such as reflection tomography, full-waveform inversion, and detailed salt interpretation are needed to derive a high-resolution velocity model that captures the lateral velocity variations. Once a velocity field is obtained, reverse time migration (RTM) can be applied to restore the structural positioning of events below and around the salt. However, RTM by nature is unable to fully recover the reflectivity at the desired amplitudes and resolution. This shortcoming is well recognized by the imaging community, and it has propelled the emergence of least-squares RTM (LSRTM) in recent years. We have investigated how current LSRTM methods perform on subsalt images. First, we compared the formulation of data-domain versus image-domain least-squares migration, as well as methods using single-iteration approximation versus iterative inversion. Then, we examined the resulting subsalt images of several LSRTM methods applied to synthetic and field data. Among our tests, we found that image-domain single-iteration LSRTM methods, including an extension of an approximate inverse Hessian method in the curvelet domain, not only compensated for amplitude loss due to poor illumination caused by complex salt bodies, but also produced subsalt images with fewer migration artifacts in the field data. In contrast, an iterative inversion method showed its potential for broadening the bandwidth in the subsalt, but it was less effective in reducing migration artifacts and noise. Based on our understanding, we evaluated the current state of LSRTM for subsalt imaging.
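To make the contrast between single-iteration and iterative LSRTM concrete, the sketch below casts data-domain least-squares migration as an iterative conjugate-gradient minimization of ||Lm - d||^2, where L stands in for Born modeling and its adjoint for migration (RTM). The plain-matrix operator and the synthetic test are assumptions for illustration; a real implementation would wrap wave-equation modeling and its adjoint instead.

```python
# Minimal sketch of iterative (data-domain) least-squares migration: solve
# min_m ||L m - d||^2 by conjugate gradients on the normal equations (CGLS).
# A plain matrix stands in for the Born modeling operator; its transpose
# plays the role of the adjoint (migration) operator.
import numpy as np

def lsqr_migration(L, d, n_iter=20):
    m = np.zeros(L.shape[1])
    r = L.T @ (d - L @ m)        # migrated residual (gradient direction)
    p, rs_old = r.copy(), r @ r
    for _ in range(n_iter):
        Lp = L @ p
        alpha = rs_old / (Lp @ Lp)
        m += alpha * p           # update the reflectivity estimate
        r -= alpha * (L.T @ Lp)
        rs_new = r @ r
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return m

# Tiny synthetic check: recover a random reflectivity from noisy data.
rng = np.random.default_rng(0)
L = rng.normal(size=(200, 50))
m_true = rng.normal(size=50)
d = L @ m_true + 0.01 * rng.normal(size=200)
print(np.linalg.norm(lsqr_migration(L, d) - m_true))
```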
Recently, multigranularity has become an interesting topic, since different levels of granularity can provide different information from the viewpoint of Granular Computing. However, established research has paid less attention to attribute reduction from a multigranularity view. This paper proposes an algorithm based on the multigranularity view. To construct a framework of multigranularity attribute reduction, two main problems must be addressed. First, the multigranularity structure must be constructed; in this paper, it is built from different radii, since different radii induce different information granularities, yielding a neighborhood-based multigranularity. Second, attribute reduction must be designed and realized from the multigranularity viewpoint; unlike the traditional process, which computes a reduct under a fixed granularity, our algorithm obtains the reduct from the multigranularity viewpoint. To realize the new algorithm, two main processes are executed: considering that different decision classes may require different key condition attributes, an ensemble selector is applied across the decision classes, and, to accelerate attribute reduction, only the finest and the coarsest granularities are employed. Experiments over 15 UCI data sets were conducted. Compared with the traditional single-granularity approach, the multigranularity algorithm not only generates a reduct that provides better classification accuracy but also reduces the elapsed time. This study suggests new trends for considering both the classification accuracy and the time efficiency of the reduct.
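A minimal sketch of neighborhood-based attribute reduction is shown below: attributes are added greedily according to the neighborhood dependency they yield, evaluated under both a small (fine) and a large (coarse) radius, echoing the use of only the finest and coarsest granularities. The greedy criterion, the radii, and the function names are assumptions for illustration rather than the paper's exact ensemble-selector procedure.

```python
# Minimal sketch of neighborhood-based attribute reduction: greedily add the
# condition attribute that most increases the neighborhood dependency (the
# fraction of samples whose delta-neighborhood is pure in decision class),
# averaged over a fine and a coarse radius. Criterion and radii are
# illustrative assumptions, not the paper's exact algorithm.
import numpy as np

def dependency(X, y, attrs, radius):
    sub = X[:, attrs]
    # pairwise distances under the selected attributes
    dist = np.linalg.norm(sub[:, None, :] - sub[None, :, :], axis=2)
    pure = [(y[dist[i] <= radius] == y[i]).all() for i in range(len(y))]
    return np.mean(pure)

def greedy_reduct(X, y, radii=(0.1, 0.3)):
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining:
        # score each candidate by its average dependency over the two radii
        scores = {a: np.mean([dependency(X, y, selected + [a], r) for r in radii])
                  for a in remaining}
        a, score = max(scores.items(), key=lambda kv: kv[1])
        if score <= best:        # stop when no attribute improves dependency
            break
        selected.append(a)
        remaining.remove(a)
        best = score
    return selected
```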
Graphical abstract: Patients with early- to middle-stage Parkinson's disease (PD) were enrolled for C-Gait assessment and traditional walking ability assessments. The correlation between the C-Gait assessment and traditional walking tests was studied, and two models were established based on the C-Gait assessment and traditional walking tests to explore the value of C-Gait assessment in predicting freezing of gait (FOG). Objective: Efficient methods for assessing walking adaptability in individuals with PD are urgently needed. Therefore, this study aimed to assess C-Gait for detecting FOG in patients with early- to middle-stage PD. Method: People with a PD diagnosis were recruited from April 2019 to November 2019 at Beijing Rehabilitation Hospital. The participants performed six walking adaptability items on an instrumented treadmill augmented with visual targets and obstacles. Walking adaptability was evaluated with the C-Gait assessment and traditional walking tests, and FOG-related indexes were collected as outcome measures. Two discriminant models were established by stepwise discriminant analysis, and the area under the receiver operating characteristic curve was used to validate the models. Result: In total, 53 patients were included in this study. Most C-Gait assessment items had no or low correlations with the traditional walking tests. The obstacle avoidance and speed of adaptation items could lead to FOG with high sensitivity. In addition, the C-Gait assessment model discriminated freezers from non-freezers slightly better than the traditional walking test models; specifically, obstacle avoidance and speed of adaptation had unique discriminant potential. Conclusion: The C-Gait assessment could provide additional value over the traditional walking tests for PD. Gait adaptability assessment, as measured by C-Gait, may help identify freezers in a PD population.
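As a rough illustration of the model validation step, the sketch below scores a discriminant model built from gait items by the area under the ROC curve for separating freezers from non-freezers. The synthetic feature matrix, the labels, and the use of scikit-learn's LDA in place of stepwise discriminant analysis are assumptions for illustration.

```python
# Minimal sketch of validating a discriminant model with ROC AUC. Synthetic
# data and scikit-learn's LDA stand in for the study's stepwise discriminant
# analysis; values here are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(53, 6))            # six walking-adaptability items
y = rng.integers(0, 2, size=53)         # 1 = freezer, 0 = non-freezer

lda = LinearDiscriminantAnalysis()
scores = cross_val_predict(lda, X, y, cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC:", roc_auc_score(y, scores))
```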