
Desire Lines: Dérive in Heterotopias
This study provides an examination and application of heterotopic dérive, a concept that combines spatial theories originated by Foucault and psychogeographical methods advocated by the Situationists, as enacted within theatrical performance spaces. The first chapter reviews theories related to space, place, and heterotopias, as well as the psychogeographical methods of the Situationists, particularly the dérive. The literature review is augmented with accounts of my experiences of serendipitous heterotopic dérive over a period of several years as a cast member in, or a technical director for, theatrical productions in the Department of Communication Studies Black Box Theatre. Based on the review, I postulate that heterotopic dérive is a potentially valuable phenomenon that performance studies scholar/artists can utilize consciously in the rehearsal process for mounting theatrical performances. To test this proposition, I worked collaboratively with a theatrical cast to craft a devised performance, Desire Lines, with a conscious effort to engender heterotopic dérive in the process of creating the performance. This performance served as the basis for the second chapter of the study, which analyzes and discusses the results of that investigation. This project enhances understanding of the significance of the places and spaces in which performers practice their craft, and argues for the potential of recognizing and utilizing the agency of heterotopic spaces such as the Black Box.
A Detailed Investigation, Comparison, and Analysis of the Practice Habits of Undergraduate Vocal and Piano Performance Majors
For musicians of all kinds, practice is an essential component in establishing and refining their skills. How a musician learns the art of practicing, and at what point in their musical and cognitive development they do so, can vary drastically. The purpose of this research is to understand how two groups of musicians, undergraduate vocal performance majors and undergraduate piano performance majors, developed (or failed to develop) their respective knowledge pertaining to effective practice prior to entering the university setting, and how their practice habits changed (or failed to change) after beginning study with a university instructor. This is accomplished by comparing the practice habits of the two groups prior to entering the university setting and after gaining admission into the degree program. Findings are supplemented with recent research pertaining to the study of learning and various types of practice.
Detecting Component Failures and Critical Components in Safety Critical Embedded Systems using Fault Tree Analysis
Component failures can result in catastrophic behaviors in safety-critical embedded systems, sometimes resulting in loss of life. Component failures can be treated as off-nominal behaviors (ONBs) with respect to the components and subsystems involved in an embedded system. Considerable research has been carried out to tackle the problem of ONBs, but these approaches focus mainly on the desired and undesired states of a system at a given point in time to detect ONBs. In this paper, an approach is discussed to detect component failures and critical components of an embedded system. The approach is based on fault tree analysis (FTA), applied to the requirements specification of embedded systems at design time to find the relationship between individual component failures and overall system failure. FTA helps in determining both the qualitative and quantitative relationships between component failures and system failure. Analyzing the system at design time helps in detecting component failures and critical components, and in devising strategies to mitigate component failures at design time and improve the overall safety and reliability of a system.
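To make the quantitative side of FTA concrete, here is a minimal Python sketch that propagates basic-event probabilities through AND/OR gates under the usual independence assumption; the gate structure, component names, and probabilities are hypothetical, not taken from the paper.

```python
# Minimal quantitative FTA sketch: propagate basic-event probabilities
# through logic gates, assuming independent basic events. The tree below
# is hypothetical, not from the paper.

def and_gate(*probs):
    # AND gate: the output fails only if every input fails.
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    # OR gate: the output fails if any input fails.
    q = 1.0
    for x in probs:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical system: failure occurs if the sensor fails OR both
# redundant controllers fail.
p_sensor = 1e-4
p_ctrl_a = 1e-3
p_ctrl_b = 1e-3

p_top = or_gate(p_sensor, and_gate(p_ctrl_a, p_ctrl_b))
print(f"top-event probability: {p_top:.3e}")  # ~1.01e-4
```

In this toy tree the OR gate exposes the single point of failure: the non-redundant sensor dominates the top-event probability, which is exactly the kind of critical-component insight the paper's design-time analysis is after.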
Detection and Classification of Heart Sounds Using a Heart-Mobile Interface
Early detection of heart disease can save lives, caution individuals, and help determine the type of treatment to be given to patients. The first test in diagnosing a heart disease is auscultation - listening to the heart sounds. The interpretation of heart sounds is subjective and requires professional skill to identify abnormalities in these sounds. A medical practitioner uses a stethoscope to perform an initial screening by listening for irregular sounds from the patient's chest. Later, echocardiography and electrocardiography tests are performed for further diagnosis. However, these tests are expensive and require specialized technicians to operate. A simple and economical method is therefore vital for monitoring in home care, rural hospitals, and urban clinics. This dissertation is focused on developing a patient-centered device for initial screening of heart sounds that is low cost, can be used by patients on themselves, and allows the readings to be shared with healthcare providers. An innovative mobile health service platform is created for analyzing and classifying heart sounds. Certain properties of heart sounds have to be evaluated to identify irregularities, such as the number of heart beats and gallops, intensity, frequency, and duration. Since heart sounds are generated at low frequencies, human ears tend to miss certain sounds, as the high frequency sounds mask the lower ones. Therefore, this dissertation provides a solution that processes the heart sounds using several signal processing techniques, identifies the features in the heart sounds, and finally classifies them. This dissertation enables remote patient monitoring through the integration of advanced wireless communications and a customized low-cost stethoscope. It also permits remote management of patients' cardiac status while maximizing patient mobility. The smartphone application facilitates recording, processing, visualizing, listening to, and classifying heart sounds. The application also generates an electronic medical …
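As a rough illustration of the kind of pipeline described (band-limit the phonocardiogram, then extract simple intensity features), here is a hedged Python sketch; the 20-200 Hz band, the feature set, and the synthetic input are illustrative assumptions, not the dissertation's actual design.

```python
# Hedged sketch: band-pass a phonocardiogram and extract simple envelope
# features. Cutoffs, features, and the synthetic input are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, fs, lo=20.0, hi=200.0, order=4):
    # Heart sounds concentrate below ~200 Hz; reject noise outside the band.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def envelope_features(x):
    env = np.abs(hilbert(x))                     # amplitude envelope
    loud = env > env.mean() + 2 * env.std()      # loud segments (S1/S2, gallops)
    return {
        "rms": float(np.sqrt(np.mean(x ** 2))),  # overall intensity
        "loud_fraction": float(loud.mean()),     # proxy for beat density
    }

fs = 2000                                        # 2 kHz sampling
pcg = np.random.randn(5 * fs) * 0.1              # stand-in for a recording
print(envelope_features(bandpass(pcg, fs)))
```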
Detection of Generalizable Clone Security Coding Bugs Using Graphs and Learning Algorithms
This research methodology isolates coding properties and identifies the probability of security vulnerabilities using machine learning and historical data. Several approaches characterize the effectiveness of detecting security-related bugs that manifest as vulnerabilities, but none utilize vulnerability patch information. The main contribution of this research is a framework that analyzes LLVM intermediate representation code and merges core source code representations using source code properties. This research is beneficial because it allows source programs to be transformed into a graphical form from which users can extract specific code properties related to vulnerable functions. The result is an improved approach to detect, identify, and track software system vulnerabilities based on a performance evaluation. The methodology uses historical function-level vulnerability information, unique feature extraction techniques, a novel code property graph, and learning algorithms to minimize the amount of end-user domain knowledge necessary to detect vulnerabilities in applications. The analysis shows approximately 99% precision and recall in detecting known vulnerabilities in the National Institute of Standards and Technology (NIST) Software Assurance Metrics and Tool Evaluation (SAMATE) project. Furthermore, 72% of the historical vulnerabilities in the OpenSSL testing environment were detected using a linear support vector classifier (SVC) model.
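The final classification step can be sketched as follows, with random vectors standing in for the features extracted from the code property graph; the feature dimension, labels, and train/test split are illustrative assumptions.

```python
# Hedged sketch of the classification step: a linear SVC trained on
# per-function feature vectors labeled vulnerable/benign. Random data
# stands in for features extracted from the code property graph.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))              # one row per function
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # stand-in vulnerability label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LinearSVC(C=1.0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("precision:", precision_score(y_te, pred))
print("recall:   ", recall_score(y_te, pred))
```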
Detection of Harmful Chemicals in the Air using Portable Membrane Inlet Mass Spectrometry
Portable mass spectrometry has become an important analytical tool for chemical detection and identification outside of a lab setting. Many variations and applications have been developed to benefit various fields of science. Membrane inlet mass spectrometry is used to allow certain analytes to pass into the mass spectrometer without breaking vacuum or letting in large particulate matter. These two important analytical tools have been applied to the detection of harmful chemicals in the air. Earth-based separations and reverse gas stack modelling are useful mathematical tools that can be used to locate the source of a chemical release by back calculation. Earth-based separations studies the way different molecules diffuse and separate through the air. Reverse gas stack modelling refers to the concentration differences of a chemical in relation to its distance from its source. These four analytical techniques can be combined to quickly and accurately locate various harmful chemical releases. The same system can be used for many applications and has been tested to detect harmful chemicals within an air-handling system. The monitoring of air-handling systems can greatly reduce the threat of harm to building occupants by detecting hazardous chemicals and shutting off the air flow to minimize human exposure.
Detection of Mercury Among Avian Trophic Levels at Caddo Lake and Lake Lewisville, TX
Mercury (Hg) is a globally distributed toxicant that has been shown to have negative effects on birds. In the United States, avian taxa have been shown to possess high Hg concentrations in the northeast, Great Lakes, and Everglades ecosystems; however, few studies have measured avian Hg concentrations in other geographic regions. Previous studies have documented high Hg concentrations in multiple organisms in east Texas, but birds were not included in these studies. The main objective of the present study was to quantify Hg concentrations in birds in differing trophic levels at Caddo Lake and Lake Lewisville, TX. Results suggest that Hg concentrations may be high enough to negatively impact some bird taxa, particularly those at high trophic levels, residing at both Caddo Lake and Lake Lewisville.
Detection of Temporal Events and Abnormal Images for Quality Analysis in Endoscopy Videos
Recent reports suggest that measuring objective quality is essential to the success of colonoscopy. Several quality indicators (i.e., metrics) proposed in recent studies are implemented in software systems that compute real-time quality scores for routine screening colonoscopy. Most quality metrics are derived from various temporal events that occur during the colonoscopy procedure. The location of the phase boundary between the insertion and withdrawal phases and the amount of circumferential inspection are two such important temporal events. These two temporal events can be determined by analyzing various camera motions of the colonoscope. This dissertation puts forward a novel method to estimate the X, Y, and Z directional motions of the colonoscope using motion vector templates. Since abnormalities in a WCE or colonoscopy video appear in only a small number of frames (around 5% of the total), it is very helpful if a computer system can decide whether a frame contains any mucosal abnormalities. Also, the number of abnormal lesions detected during a procedure is used as a quality indicator. The majority of existing abnormality detection methods focus on detecting only one type of abnormality, or their overall accuracies are somewhat low when they try to detect multiple abnormalities. Most abnormalities in endoscopy images have unique textures which are clearly distinguishable from normal textures. In this dissertation, a new method is proposed that achieves the objective of detecting multiple abnormalities with higher accuracy using a multi-texture analysis technique. The multi-texture analysis method is designed by representing WCE and colonoscopy image textures as textons.
Detection of Ulcerative Colitis Severity and Enhancement of Informative Frame Filtering Using Texture Analysis in Colonoscopy Videos
There are several types of disorders that affect our colon’s ability to function properly, such as colorectal cancer, ulcerative colitis, diverticulitis, irritable bowel syndrome, and colonic polyps. Automatic detection of these diseases would inform the endoscopist of possible sub-optimal inspection during the colonoscopy procedure as well as save time during post-procedure evaluation. However, existing systems detect only a few of these disorders, such as colonic polyps. In this dissertation, we address the automatic detection of another important disorder, ulcerative colitis. We propose a novel texture feature extraction technique to detect the severity of ulcerative colitis at the block, image, and video levels. We also enhance current informative frame filtering methods by detecting water and bubble frames using our proposed technique. Our feature extraction algorithm, based on accumulation of pixel value differences, provides better accuracy at faster speed than existing methods, making it highly suitable for real-time systems. We also propose a hybrid approach in which our feature method is combined with existing feature method(s) to provide even better accuracy. We extend the block and image level detection method to video level severity score calculation and shot segmentation. Also, the proposed feature extraction method can detect water and bubble frames in colonoscopy videos with very high accuracy in significantly less processing time, even when clustering is used to reduce the training size by a factor of 10.
Determinacy-related Consequences on Limit Superiors
Laczkovich proved from ZF that, given a countable sequence of Borel sets on a perfect Polish space, if the limit superior along every subsequence is uncountable, then there is a particular subsequence whose intersection actually contains a perfect subset. Komjath later expanded the result to hold for analytic sets. In this paper, by adding AD and sometimes V=L(R) to our assumptions, we extend the result further. This generalization includes increasing the length of the sequence to certain uncountable regular cardinals as well as removing any descriptive requirements on the sets.
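For reference, the limit superior of a sequence of sets, the notion at issue here, is standardly defined as the set of points belonging to infinitely many terms:

\[
  \limsup_{n \to \infty} A_n \;=\; \bigcap_{n=1}^{\infty} \bigcup_{k \ge n} A_k
  \;=\; \{\, x : x \in A_k \text{ for infinitely many } k \,\}.
\]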
The Determinants and Consequences of Empathic Parenting: Testing an Expansion of Belsky's Model of Parenting Using SEM
An understanding of factors that enhance empathic parenting behaviors is of considerable importance to the study of child development and to the development of parenting interventions to promote child adjustment. Moreover, gaining a better understanding of the factors that predict empathic parenting with older children is of interest since most research examining parental empathy focuses on infants. These were the goals of the current study. Guided by Belsky's 1984 process model of the determinants of parenting that impact child development, an expanded model of the determinants of parenting is proposed that includes various parent, child, and contextual factors of influence. Using data from a community sample, a partial least squares path analysis approach was employed to test the model's strength in predicting empathically attuned parenting with children ages 5 to 10 years and, ultimately, the child's psychoemotional functioning. Results support the expanded model; however, a reduced model was found to be superior and revealed unique relationships between the determinants of parenting. Specifically, a parent's psychoemotional functioning and childrearing beliefs and attitudes were found to be critical to the parent's ability to engage in empathic parenting behaviors. Other parent factors such as the parent's developmental history of abuse, maladaptive personality traits, and age, along with contextual factors and child characteristics, were found to influence parenting only indirectly through their impact on the parent's level of psychoemotional distress or childrearing beliefs and attitudes. Ultimately, the current findings support Belsky's claim that parent factors are the strongest predictors of empathic parenting. Implications of these findings are many. The results highlight the importance of assessing a parent's childrearing beliefs and attitudes and level of distress in conjunction with characteristics of the child when a family comes in for treatment. Moreover, the results identify many points of intervention for stopping the cycle of abuse.
Determinants and Impacts of Pinterest Consumer Experiences
Pinterest is one of the fastest growing social networking sites and is attracting the interest of retailers as an effective way to interact with consumers. The purpose of this study was twofold: 1) to examine the determinants and impacts of Pinterest consumer experiences, specifically the impacts of retailer reputation, trust, perceived ease of use, and perceived usefulness on Pinterest consumer experiences on retailer SNS; and 2) to identify the impacts of Pinterest consumer experiences on consumer satisfaction, behavioral intention, and the online retailer relationship. The instrument used existing scales drawn from the literature. Data were collected from an online consumer panel (n = 300) of Pinterest users who connect to apparel retailers. Reputation is positively related to trust and to perceived ease of use. Perceived ease of use and usefulness significantly affected retailer Pinterest consumer experiences. The impact of Pinterest consumer experiences on satisfaction and behavioral intention was positive and significant. Satisfaction and behavioral intention also are significantly related to the online retailer relationship. Results and business implications are discussed, as are limitations and future research.
Determinants of Citizens’ 311 Use Behaviors: 311 Citizen-initiated Contact, Contact Channel Choice, and Frequent Use
Facing increasingly complex policy issues and diminishing citizen satisfaction with government and service performance, managing the quality of citizen relationship management has become a main challenge for public managers. Solutions to complex policy problems of service performance and low levels of citizen participation often must be developed by encouraging citizens to make their voices heard through various participation mechanisms. Reflecting this need, municipal governments in the U.S. have developed centralized customer systems for citizen relationship management. The 311 centralized customer system (referred to as 311 in this study) has the functions of citizen-initiated contact, service co-production, and transaction, and many local governments launch 311 to maintain or enhance their relationship with the public. Using 311 is technically easy and free for citizens, but ensuring some degree of citizen engagement and citizens’ 311 use has been challenging for local public managers of municipalities. Despite calls for recognizing the importance of 311 in the service and information delivery process, fair treatment and access to governmental information, citizen participation, government responsiveness, and citizen satisfaction, to the best of our understanding no empirical studies explore citizens’ 311 behaviors at the micro, individual level in the field of public administration. This dissertation provides a comprehensive understanding of the 311 centralized customer system, helps local public managers know citizens’ perceived perspectives toward the operation of 311, and assists these managers in developing an effective 311 system in municipalities. The dissertation’s main purpose is to clarify the importance of 311 to citizen relationship management and provide insights into citizens’ 311 use behaviors. More specifically, this dissertation tries to answer the following questions: a. Why do citizens use 311? Do the various groups of the population access and use 311 in San Francisco equally? If not, what factors influence citizens’ 311 citizen-initiated contact behaviors? b. …
Determinants of Corporate Governance Choices: Evidence from Listed Foreign Firms on U.S. Stock Exchanges
This study analyzes corporate governance practices of foreign (non-U.S.) issuers listed on the New York Stock Exchange (NYSE) and Nasdaq. Specifically, I examine the extent to which these foreign issuers voluntarily comply with U.S. stock exchange corporate governance requirements applicable to domestic issuers. My sample consists of 201 foreign companies primarily domiciled in Brazil, China, Israel, and the United Kingdom. I find that 151 (75 per cent) of the sample firms do not elect to comply with any of the U.S. corporate governance requirements. Logistic regression analysis generally supports the hypotheses that conformance with U.S. GAAP and percentage of managerial ownership are positively associated, and that percentage ownership by major shareholders is negatively associated with foreign firms electing to comply with U.S. corporate governance rules. This evidence is relevant for regulators and investors.
Determinants of Effort and Associated Cardiovascular Response to a Behavioral Restraint Challenge
This study directly tested implications of motivation intensity theory on effort to restrain against a behavioral urge or impulse (i.e., restraint intensity). Two factors were manipulated—magnitude of an urge and the importance of successfully resisting it—with cardiovascular (CV) responses related to active coping measured. Male and female undergraduate students were presented with a mildly or strongly evocative film clip with instructions to refrain from showing any facial response. Success was made more or less important through coordinated manipulations of outcome expectancy, ego-involvement, and performance assessment. As expected, systolic blood pressure responses assessed during the performance period were proportional to the evocativeness of the clip when importance was high, but low regardless of evocativeness when importance was low. These findings support a new conceptual analysis concerned with the determinants and CV correlates of restraint intensity. Implications of the study and associations with current self-regulatory literature are discussed.
Determinants of Mental Health Problems Among College Students
Many college students have reported struggling with mental health problems while dealing with challenging demands of college. The initial theoretical framework for this research was Pearlin's stress process model (SPM). Building on the SPM, the three additional mediating variables of perceived control, meaninglessness, and financial worries were added to create a composite model for the research. Mental health outcomes in the model were measured by a comprehensive range of factors, which included: psychological distress, suicide, substance abuse, and anger. Data were collected from a non-probability convenience sample of 463 undergraduate students attending a large state supported university in the southwestern region of the United States. Among the social status variables measured, being married, female, and white were significant predictors of poor mental health in the sampled college students. Poor self-image, feeling of meaninglessness, and worrying about current and future finances were significant mediating variables. Poor mental health could make individuals overwhelmed and discouraged. This is a formula for failure in college. The results of this study contribute to a better understanding of the correlates of mental health problems among college students. A greater understanding means that families and college administrations will have better ideas about how to intervene to reduce the stress of students and to focus the available and often limited resources to help young adults in their college experience.
Determinants of Outbound Cross-border Mergers and Acquisitions by Emerging Asian Acquirers
This dissertation identifies key determinants of outbound cross-border mergers and acquisitions (M&As) by emerging Asian acquirers during 2001-2012. Using a zero-inflated model that takes into account different mechanisms governing country pairs that never engage in cross-border M&As and country pairs that actively participate in cross-border M&As, I uncover unique patterns for emerging Asian acquirers. Emerging Asian acquirers originate from countries with lower corporate tax rates than those countries where their targets are located. Furthermore, the negative impact of an international double tax burden is significantly larger than that found in previous studies. While country governance differences and geographical and cultural differences are important determinants of international M&As, relative valuation effects are muted. Coefficients of these determinants vary substantially, depending on whether targets are located in developing or advanced nations. Also, determinants differ considerably between active and non-active players in cross-border M&As. Moreover, comparisons of empirical models illustrate that estimating a non-linear model and taking into account both the bounded nature and non-normal distributions of fractional response variables lead to different inferences from those drawn from a linear model estimated by the ordinary least squares method. Overall, emerging Asian acquirers approach the deals differently from patterns documented in developed markets. So, when evaluating foreign business combinations or devising policies, managers or policymakers should consider these differences.
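A zero-inflated count model of the kind described separates the "never engage" mechanism from the deal-count mechanism. Below is a minimal Python sketch using statsmodels, with synthetic stand-ins for the determinants; the dissertation's actual specification and estimator may differ.

```python
# Hedged sketch: a zero-inflated Poisson with a logit component for pairs
# that never engage in cross-border M&As and a Poisson component for deal
# counts. Variables are synthetic stand-ins for the actual determinants.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(1)
n = 500
tax_gap = rng.normal(size=n)                 # acquirer-target tax difference
distance = rng.normal(size=n)                # geographic/cultural distance
X = sm.add_constant(np.column_stack([tax_gap, distance]))

never_active = rng.random(n) < 0.4           # structural zeros
counts = rng.poisson(np.exp(0.5 + 0.3 * tax_gap - 0.2 * distance))
deals = np.where(never_active, 0, counts)

model = ZeroInflatedPoisson(deals, X, exog_infl=X, inflation="logit")
print(model.fit(maxiter=200, disp=False).summary())
```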
Determinants of Principal Pay in the State of Texas
The purpose of the study was to examine district, campus, and community determinants of principals’ salaries using a spatial econometric framework. Among the district variables, business tax (p = 0.001), property tax (p < .01), and the Herfindahl Index (a measure of competition) were statistically significant indicators of principal salaries. The results suggest that more affluent districts tend to pay principals higher salaries, which was expected. Regarding campus characteristics, the percentage of economically disadvantaged students was not a statistically significant indicator (p = 0.468), but campus enrollment was significant (p < .01). Interestingly, as the percentage of economically disadvantaged students increased, the principal salary decreased. In contrast, as student enrollment increases, the salary of principals increases, suggesting that principals of larger campuses earn higher salaries. Student achievement was not a statistically significant predictor of principals’ salary, a notable finding given that pay for performance in Texas is at the forefront of political debate. Among the variables examined at the community level, only the percentage of owner-occupied homes (p = 0.002) was found to be a statistically significant indicator of principal salary. The lack of evidence on reforms, such as determinants of principal salary, points to data and research deficiencies to be addressed in order to learn more about their effects and make sound public policies. The paper utilized a spatial regression approach to examine the determinants of principal salary using data from local, state, and national data sources. Principal salaries are viewed through several lenses in this study by considering effective outcomes of pay defined by actual salaries and market considerations for pay as defined by community, organizational, and human capital variables. Literature from the private sector as well as from the public school setting was used as a theoretical underpinning for the hypotheses set forth in this study. …
Determinants of the Applications to the Institutional Care in Turkey: Darulaceze Example
Although institutional care has started to become outmoded in developed countries with the development of different models of care, it still has a considerable place in developing countries such as Turkey. This is because changes in the demographic structure, the extended family, and urban development in Turkey have brought about several aging problems, leading older adults to end up in institutions. Loneliness was one of the significant reasons given in the Social Inquiry Survey of Applicants of the Darulaceze Old-Age Institution and the basis for a micro level analysis in this study. Therefore, the main objective of the study was to determine the predictors of loneliness, including age, the state of living alone, functional independence, education, and gender. Analysis of the results indicated that these predictors have significant effects on loneliness, which is predominantly defined by social factors rather than medical factors. In addition, meso and macro level analyses were employed to control the micro level analysis and present a general picture of institutional care. Thus, an academic example of diagnosing the main reasons behind institutional care was presented to understand the context of aging in Turkey.
Determinants of Women's Autonomy in Nepal
Nepal in recent times has witnessed a proliferation of community-based organizations (CBOs). Established by local residents, CBOs are small-scale organizations that promote and defend the rights and interests of people, especially minorities and the disadvantaged. One such minority group that CBOs greatly focus on is women. Despite the dramatic increase in the number of CBOs in Nepal, their impact on women is understudied. The purpose of this dissertation is to analyze the relationship between Nepalese women's participation in CBOs and their autonomy. Autonomy comprises four dimensions: physical mobility, financial autonomy, household decision-making, and reproductive autonomy. Modifying the conceptual framework used by Mahmud, Shah, and Becker in 2012, I hypothesize that women who participate in CBOs experience greater autonomy. Data from the 2008 Chitwan Valley Family Study are used for analysis. Using SPSS, separate logistic regressions are run to analyze the relationship between CBO membership and each dimension of autonomy. The results support three of the four proposed major hypotheses. Nepalese women who participate in CBOs have greater autonomy in terms of physical mobility, financial autonomy, and household decision-making. No evidence was found to establish a link between CBO membership and reproductive autonomy. The variables controlled for in the study include age, caste, religion, education, marital status, exposure to television, exposure to radio, and relationship with one's mother-in-law.
Determination of Bioconcentration Potential of Selected Pharmaceuticals in Fathead Minnow, Pimephales promelas, and Channel Catfish, Ictalurus punctatus
The primary objective of this work was to determine the tissue-specific bioconcentration factors (BCFs) of the selected pharmaceuticals: norethindrone (NET), ibuprofen (IBU), verapamil (VER), clozapine (CLZ), and fenofibrate (FFB) in two freshwater fishes: fathead minnow and channel catfish. BCF tests on fathead minnows followed the standard OECD 42-day test, while a 14-day abridged test design was used in catfish exposures. Additional objectives included a) comparing the measured BCFs to the US EPA's BCFWIN model predicted values, b) comparing the BCF results from the standard and reduced tests, and c) predicting the chronic risk of the pharmaceuticals to fish using the human therapeutic plasma concentrations. Each test included uptake and depuration phases to measure tissue-specific kinetic BCFs. The results indicated that all the pharmaceuticals, except IBU, have the potential for accumulation in fish. Estimated BCFs for NET, VER, and FFB may not be significant in view of the current regulatory trigger level (BCF ≥ 2000); however, CLZ's BCF in the liver approached the criterion level. Significant differences were noticed in the tissue-specific uptake levels of the pharmaceuticals, with the following general trend: (liver/kidney) > (gill/brain) > (heart/muscle) > plasma. IBU uptake was highest in the plasma. When compared to the measured BCFs, predicted values for NET, IBU, VER, and FFB were slightly overestimated but did not differ largely. However, the measured BCF of CLZ in the liver was approximately two orders of magnitude higher than the predicted level. The tissue BCFs for the two species were not widely different, indicating the potential usefulness of the reduced BCF test. Comparison of fish and human plasma levels indicated that NET, CLZ, and VER have the potential to cause chronic effects in fish.
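For reference, kinetic BCFs from uptake/depuration designs of this kind are conventionally derived from a first-order one-compartment model (as in the OECD guideline), with \(C_f\) the fish tissue concentration, \(C_w\) the water concentration, and \(k_u\), \(k_d\) the uptake and depuration rate constants:

\[
  \frac{dC_f}{dt} = k_u\,C_w - k_d\,C_f,
  \qquad
  \mathrm{BCF}_k = \frac{k_u}{k_d}.
\]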
Determination of Molecular Descriptors for Illegal Drugs by GC-FID Using Abraham Solvation Model
The Abraham solvation parameter model is a good approach for analyzing and predicting biological activities and partition coefficients. The general solvation equation has been used to predict the solute property (SP) behavior of drug compounds between biological barriers. Gas chromatography (GC) retention times can be used to predict molecular descriptors, such as E, S, A, B, and L, for existing and newly developed drug compounds. In this research, six columns of different stationary phases were used to predict the Abraham molecular descriptors more accurately. The six stationary phases used were 5% phenylmethyl polysiloxane; 6% cyanopropylphenyl, 94% dimethylpolysiloxane; 5% diphenyl, 95% dimethylpolysiloxane; 100% dimethylpolysiloxane; polyethylene glycol; and 35% diphenyl, 65% dimethylpolysiloxane. Retention times (RT) of 75 compounds were measured and the logarithms of the experimental average retention times, Ln(RTexp), were calculated. The Abraham solvation model was then applied to predict the process coefficients of these compounds using the literature values of the molecular descriptors (Acree Compilation descriptors). Six correlation equations were built as a training set for each of the six columns. The six equations were then used to predict the molecular descriptors of the illegal drugs as a test set. This work shows the ability to extract molecular information from a new compound by utilizing commonly used GC columns available with the desired stationary phases. One can simply run the new compound in GC using these columns to get the retention time. Plugging the retention time into the developed equation for each column predicts the molecular descriptors for the test compound and gives some information about the properties of the compound.
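The general solvation equation referred to here has the standard Abraham form

\[
  \log SP \;=\; c + eE + sS + aA + bB + lL,
\]

where SP is the solute property being correlated (here a logarithmic retention measure), the uppercase terms are the solute descriptors, and the lowercase coefficients characterize each stationary phase; fitting the training set yields one such equation per column.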
Determination of Solute Descriptors for Illicit Drugs Using Gas Chromatographic Retention Data and Abraham Solvation Model
In this experiment, more than one hundred volatile organic compounds were analyzed with the gas chromatograph. Six capillary columns (ZB wax plus, ZB 35, TR1MS, TR5, TG5MS, and TG1301MS) with different polarities were used for separation of the compounds and illicit drugs. The Abraham solvation model has five solute descriptors: E, S, A, B, and L (or V). Based on the six stationary phases, six equations were constructed as a training set for each of the six columns. The six equations served to calculate the solute descriptors for a set of illicit drugs. The drugs studied are nicotine (S = 0.870, A = 0.000, B = 1.073), oxycodone (S = 2.564, A = 0.286, B = 1.706), methamphetamine (S = 0.297, A = 1.570, B = 1.009), heroin (S = 2.224, A = 0.000, B = 2.136), and ketamine (S = 1.005, A = 0.000, B = 1.126). The solute property in the Abraham solvation model is represented as a logarithm of retention time; thus the logarithms of the experimental and calculated retention times are compared.
A Determination of the Fine Structure Constant Using Precision Measurements of Helium Fine Structure
Spectroscopic measurements of the helium atom are performed to high precision using an atomic beam apparatus and electro-optic laser techniques. These measurements, in addition to serving as a test of helium theory, also provide a new determination of the fine structure constant α. An apparatus was designed and built to overcome limitations encountered in a previous experiment. Not only did this allow an improved level of precision but also enabled new consistency checks, including an extremely useful measurement in ³He. I discuss the details of the experimental setup along with the major changes and improvements. A new value for the J = 0 to 2 fine structure interval in the 2³P state of ⁴He is measured to be 31 908 131.25(30) kHz. The 300 Hz precision of this result represents an improvement over previous results by more than a factor of three. Combined with the latest theoretical calculations, this yields a new determination of α with better than 5 ppb uncertainty, α⁻¹ = 137.035 999 55(64).
Determination of the Optimal Number of Strata for Bias Reduction in Propensity Score Matching.
Previous research implementing stratification on the propensity score has generally relied on using five strata, based on prior theoretical groundwork and minimal empirical evidence as to the suitability of quintiles to adequately reduce bias in all cases and across all sample sizes. This study investigates bias reduction across varying numbers of strata and sample sizes via a large-scale simulation to determine the adequacy of quintiles for bias reduction under all conditions. Sample sizes ranged from 100 to 50,000 and strata from 3 to 20. Both the percentage of bias reduction and the standardized selection bias were examined. The results show that, while the particular covariates in the simulation met certain criteria with five strata, greater bias reduction could be achieved by increasing the number of strata, especially with larger sample sizes. Simulation code written in R is included.
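A Python re-sketch of the core procedure (the study's own simulation code is in R): estimate propensity scores, cut them into k quantile strata, and compare covariate bias before and after stratification. The data-generating process and covariate choice below are illustrative assumptions.

```python
# Hedged re-sketch of propensity score stratification and bias reduction.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, k = 5000, 5                                    # sample size; quintiles
x = rng.normal(size=(n, 3))                       # covariates
logit = x @ np.array([0.8, -0.5, 0.3])
treat = rng.random(n) < 1 / (1 + np.exp(-logit))  # selection on covariates

ps = LogisticRegression().fit(x, treat).predict_proba(x)[:, 1]
edges = np.quantile(ps, np.linspace(0, 1, k + 1))
stratum = np.clip(np.digitize(ps, edges[1:-1]), 0, k - 1)

def diff(mask):
    # Treated-vs-control difference in the first covariate's mean.
    return x[treat & mask, 0].mean() - x[~treat & mask, 0].mean()

bias_before = diff(np.ones(n, dtype=bool))
bias_after = np.mean([diff(stratum == s) for s in range(k)])
print(f"bias reduction: {100 * (1 - bias_after / bias_before):.1f}%")
```

Rerunning with k raised above 5 reproduces the study's central question: whether additional strata buy further bias reduction at a given sample size.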
Determining Factors that Influence High School Principal Turnover Over a Five Year Period
The purpose of this study was to determine the effects of salary, compensation and benefits, accountability, job stress, increased instructional responsibilities, changes in student demographics, lack of support, politics, advancement opportunities and promotion on tenure and turnover among high school principals in the state of Texas. The participants in the study included 60 Texas high school principals who left a high school principalship for a different high school principalship within the past 5 years. The participants completed the Texas Principal Survey and data were analyzed using binary logistic regression. The data indicated that salary, compensation and benefits was a significant factor in predicting an increase in the odds of principal turnover for principals who had been in their prior principalship 5 or more years over principals who had been in their prior principalship less than 5 years. Additionally, advancement opportunities was a significant factor in predicting a decrease in the odds of principal turnover for principals who had been in their prior principalship 5 or more years over principals who had been in their prior principalship less than 5 years. Responses from an open ended question asking principals why they left their prior principalship suggested that principals left for reasons including new challenges, lack of support and family. The results of this study support the need for continued research in the area of principal turnover and provide insight to district superintendents, school boards and principals.
Determining the Authenticity of the Concerto for Two Horns, WoO 19, Attributed to Ferdinand Ries
Ferdinand Ries is credited as the composer of the Concerto for Two Horns, WoO 19, preserved in the Berlin State Library. Dated 1811, the Concerto was ostensibly written in the same year as Ries’s Horn Sonata, Op. 34, yet the writing for the horns in the Concerto is significantly more demanding. Furthermore, Ries added to the mystery by not claiming the Concerto in his personal catalog of works or mentioning it in any surviving correspondence. The purpose of this dissertation is to study the authorship of the Concerto for Two Horns and offer possible explanations for the variance in horn writing. Biographical information on Ries is given, followed by a stylistic analysis of Ries’s known works. A stylistic analysis of the Concerto for Two Horns, WoO 19, is offered, including a handwriting comparison between the Concerto for Two Horns and Ries’s Horn Sonata. Finally, possible explanations are proposed that rationalize the variance in horn writing between the Concerto for Two Horns, WoO 19, and Ries’s other compositions that include the horn.
Determining the Diagnostic Accuracy of and Interpretation Guidelines for the Complex Trauma Inventory (CTI)
The work group in charge of editing the trauma disorders in the upcoming edition of the International Classification of Diseases (ICD-11) made several changes to the trauma criteria. Specifically, they simplified the criteria for posttraumatic stress disorder (PTSD) and added a new trauma disorder called complex PTSD (CPTSD). To assess the new and newly defined trauma disorders, Litvin, Kaminski and Riggs developed a self-report trauma measure called the Complex Trauma Inventory (CTI). Although the reliability and validity of the CTI has been supported, no empirically-derived cutoff scores exist. We determined the optimal CTI cutoff scores using receiver operating characteristic (ROC) analyses in a diverse sample of 82 participants who experienced trauma and were recruited from an inpatient trauma unit, student veteran organizations, and university classrooms. We used the Clinician-Administered Interview for Trauma Disorders (CAIT) to diagnose the presence of an ICD-11 trauma disorder, and we correlated the results of the CAIT with the Clinician-Administered PTSD Scale for the DSM-5 to establish the convergent validity of the CAIT, r = .945, p < .001. For the ROC analyses, the CTI was used as the index test and the CAIT was used as the criterion test. The area under the curve (AUC) analyses indicated good to excellent effect sizes, AUC = .879 to .904. We identified two sets of cutoff scores for the CTI: the first set prioritized the sensitivity of the CTI scores and ranged from .884 to .962; the second set prioritized the specificity of the CTI scores and the false-positive scores (1-specificity) ranged from .054 to .143. Our study enhanced the utility of the CTI and addressed another need in the trauma field by developing a structured clinical interview (CAIT) that can be used to diagnose the ICD-11 trauma disorders.
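The cutoff-derivation logic can be sketched with scikit-learn's ROC utilities; the synthetic scores, the .90 sensitivity floor, and the .10 false-positive ceiling below are illustrative assumptions, not the study's actual values.

```python
# Hedged sketch of cutoff derivation: an index test is scored against the
# criterion diagnosis via ROC analysis, and thresholds are chosen to
# prioritize sensitivity or specificity. Data are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
diagnosis = rng.random(200) < 0.4                       # criterion test result
score = rng.normal(loc=np.where(diagnosis, 2.0, 0.0))   # index test score

fpr, tpr, thresholds = roc_curve(diagnosis, score)
print("AUC:", roc_auc_score(diagnosis, score))

# Sensitivity-first: highest threshold whose sensitivity reaches .90.
sens_first = thresholds[np.argmax(tpr >= 0.90)]
# Specificity-first: lowest threshold keeping the false-positive rate <= .10.
spec_first = thresholds[np.argmax(fpr > 0.10) - 1]
print("cutoffs:", sens_first, spec_first)
```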
Determining the Emissivity of Roofing Samples: Asphalt, Ceramic and Coated Cedar
The goal is to perform heat measurements on selected roofing material samples: asphalt shingles, ceramics, and cedar. It is important to understand the concept of heat transfer, which consists of conduction, convection, and radiation. Research work was reviewed on different infrared devices to see which one would be suitable for conducting my experiment. In this experiment, the main focus was on a specific property of radiation: the emissivity, which is the amount of heat a material is able to radiate compared to a blackbody. An infrared measuring device, the infrared camera, was used to determine the emissivity of each sample by using a measurement formula consisting of certain equations. These equations account for the emissivity, the transmittance of heat through the atmosphere, and the temperatures of the samples, atmosphere, and background. The experiment verifies how reasonable the data are compared to values in the emissivity table. A blackbody reference method, electrical black tape, was applied to help generate correct data. With these data obtained, the emissivity was examined to understand what factors and parameters affect this property of the materials. The experiment was conducted using a suitable heat source to heat the material samples to high temperature. The measurements were taken during the experiment and displayed by the IR camera. The IR images show the behavior of surface temperatures distributed across the different materials. The main challenge was to determine the most accurate emissivity values for all material samples. The results obtained by the IR camera were displayed in figures and tables at different distances between the heat lamp and the materials. The materials exhibited different behaviors in temperature and emissivity at certain distances. The emissivity of each material varied with different temperatures. The results led to suggestions …
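A standard form of the single-band IR thermography measurement formula alluded to here (the thesis's exact equations may differ) balances emitted, reflected, and atmospheric contributions to the radiation the camera receives:

\[
  W_{\mathrm{tot}} \;=\; \varepsilon\,\tau\,W_{\mathrm{obj}}
  \;+\; (1-\varepsilon)\,\tau\,W_{\mathrm{refl}}
  \;+\; (1-\tau)\,W_{\mathrm{atm}},
\]

where ε is the sample emissivity, τ the atmospheric transmittance, and the W terms are blackbody radiances at the object, reflected-background, and atmosphere temperatures. With a high-emissivity reference such as the black tape fixing the true surface temperature, ε can be solved for from the camera reading.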
Determining the Relation Between the Moments of Acquisition of Baseline Conditional Discriminations and the Emergence of Equivalence Relations
The experiment was an attempt to gain a more precise understanding of the temporal relation between the development of analytic units and equivalence relations. Two prompting procedures were used during training to pinpoint when eight subjects learned the conditional discriminations. Near simultaneous presentation of probe and training trials allowed for examination of the temporal relation between conditional discrimination acquisition and derived performances on stimulus equivalence probes. The data show that, for seven of eight subjects, a decreased reliance on prompts was coincident with the development of equivalence-consistent choices on either all or some probe trials, which suggests that the development of analytic units is sufficient to give rise to equivalence relations among stimuli.
Determining the Reliability and Use of the Center for Community College Student Engagement Survey of Entering Student Engagement As a Tool to Predict Student Success in a Large Urban Community College District
As community colleges have gained more recognition as a viable pathway for students to enter higher education, they have faced greater accountability that has prompted both practitioners and policy makers to attempt to find solutions and tools, such as the National Survey of Student Engagement, the Community College Survey of Student Engagement, and the Survey of Entering Student Engagement (SENSE), to aid in improving student success outcomes. This study addressed the validity and reliability of the SENSE instrument using a three-pronged approach via student data collected over 3 years of SENSE administrations at a large urban community college (n = 4,958). The instrument was first factor analyzed against the SENSE benchmarks for effective educational practice through generalized least squares and principal component exploratory factor analysis. Although the instrument did not deliver a chi-square factored fit for the six benchmark categories, consistent loadings were observed. Second, construct reliability was tested for each benchmark category, and for the survey as a whole, using Cronbach’s alpha. The individual benchmark categories did not yield coefficient scores sufficient for establishing construct reliability. However, the overall survey produced a Cronbach’s alpha of .85, clearly indicating construct reliability for all items combined. Third, correlations between SENSE perception scores and community college students’ grade point averages, fall-to-fall retention, semester credit hours, course completion for developmental and college gateway courses, and degree and certificate completion were calculated. Although no strong correlations were observed, the SENSE may be useful to community colleges seeking to increase completion rates.
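For reference, the reliability statistic used above can be computed directly; this is a minimal Python sketch on synthetic item responses, not the study's data.

```python
# Minimal sketch of Cronbach's alpha from an items-by-respondents matrix.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = survey items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
trait = rng.normal(size=(500, 1))                 # shared latent trait
responses = trait + rng.normal(scale=0.8, size=(500, 10))
print(f"alpha = {cronbach_alpha(responses):.2f}")
```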
Determining Whether and When People Participate in the Events They Tweet About
This work describes an approach to determine whether people participate in the events they tweet about. Specifically, we determine whether people are participants in events with respect to the tweet timestamp. We target all events expressed by verbs in tweets, including past, present, and future events. We define event participants as people directly involved in an event, regardless of whether they are the agent, the recipient, or play another role. We present an annotation effort, guidelines, and quality analysis with 1,096 event mentions. We discuss the label distributions and event behavior in the annotated corpus. We also explain several features used and a standard supervised machine learning approach to automatically determine if and when the author is a participant in the event in the tweet. We discuss trends in the results obtained and draw important conclusions.
Deva Plus Dog
Deva Plus Dog is a look at the life of a teenage girl singularly devoted to the sport of dog agility. The film explores how relationships develop and evolve in the high stakes world of competition, and how an alternative lifestyle impacts a youth’s coming of age.
Devaluing Stigma in the Context of Forgiveness, Coping and Adaptation: a Structural Regression Model of Reappraisal
The 2010 National HIV/AIDS Strategy outlined three important goals for managing the current HIV pandemic in the U.S.: (1) reduce the number of people who become infected with HIV; (2) improve access to health care and health-related outcomes for people living with HIV/AIDS (PLH/A); and (3) reduce HIV-related health disparities. Each of these goals tacitly depends upon reducing HIV-related stigma, and this study examined how HIV+ individuals evaluate coping efforts to overcome stigma’s impact on quality of life (QOL). A structural regression model was developed to instantiate the reappraisal process described by Lazarus and Folkman’s transactional theory of stress and coping, and this model indicated that maladaptive coping fully mediated the relationship between dispositional forgiveness and perceived stigma, which supports the prediction that coping efficacy is related to stress reduction. Additionally, maladaptive coping fully mediated the relationship between dispositional forgiveness and QOL, supporting the contention that forgiveness is a critical aspect of the evaluative process that influences how PLH/A cope with stigma. Lastly, the model showed that when PLH/A engaged in maladaptive coping to mitigate stress-related stigma, these individuals experienced increased stigmatization and reported significantly lower levels of health-related QOL. In contrast, PLH/A who reported higher levels of dispositional forgiveness were significantly less likely to use maladaptive coping to overcome stigma. Therefore, dispositional forgiveness works through coping to alter perceptions regarding stigmatization, while indirectly influencing attitudes related to health distress, mental health, and cognitive and social functioning. The theoretical and clinical implications of these findings are discussed.
Developer
A chapbook-length collection of poems.
Developing a Collection Digitization Workflow for the Elm Fork Natural Heritage Museum
Natural history collections house immense amounts of data, but the majority of the data are only accessible by locating the collection label, which is usually attached to the physical specimen. This method of data retrieval is time consuming and can be very damaging to fragile specimens. Digitizing the collections is one way to reduce the time and potential damage related to finding the collection objects. The Elm Fork Natural Heritage Museum is a natural history museum located at the University of North Texas and contains collections of both vertebrate and invertebrate taxa, as well as plants. This project designed a collection digitization workflow for Elm Fork by working through digitizing the Benjamin B. Harris Herbarium. The collection was cataloged in Specify 6, a database program designed for natural history collection management. By working through one of the museum’s collections, the project was able to identify and address challenges related to digitizing the museum’s holdings in order to create robust workflows. The project also produced a series of documents explaining common processes in Specify and a data management plan.
Developing a Forest Gap Model to Be Applied to a Watershed-scaled Landscape in the Cross Timbers Ecoregion Using a Topographic Wetness Index
A method was developed for extending a fine-scaled forest gap model to a watershed-scaled landscape, using the Eastern Cross Timbers ecoregion as a case study for the method. A topographic wetness index calculated from digital elevation data was used as a measure of hydrologic conditions across the modeled landscape, and the gap model was modified to include a topographically based hydrologic input parameter. The model was parameterized by terrain type units that were defined using combinations of USDA soil series and classes of the topographic wetness index. A number of issues regarding the sources, grid resolutions, and processing methods of the digital elevation data are addressed in this application of the topographic wetness index. Three different grid sizes, 5, 10, and 29 meter, from both LiDAR-derived and contour-derived elevation grids were used, and the grids were processed using both single-directional and bi-directional flow algorithms. The results from these different grids were compared and analyzed in the context of their application in defining terrain types for the forest gap model. Refinements were made in the timescale of the gap model’s weather model, converting it into a daily weather generator, in order to incorporate the effects of the new topographic/hydrologic input parameter. The precipitation model was converted to use a Markov model to initiate a sequence of wet and dry days for each month, and daily precipitation amounts were then determined using a gamma distribution. The output of the new precipitation model was analyzed and compared with a 100-year history of daily weather records at daily, monthly, and annual timescales. Model assumptions and requirements for biological parameters were thoroughly investigated and questioned. Often these biological parameters are based on little more than assumptions and intuition. An effort was made to base as many of the model’s biological parameters as possible on measured data, including a new …
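The topographic wetness index used above is conventionally computed as ln(a / tan β), with a the upslope contributing area per unit contour length and β the local slope. The daily weather generator can be sketched as below; the transition probabilities and gamma parameters are illustrative stand-ins, whereas the dissertation fits them monthly from station records.

```python
# Hedged sketch of the daily weather generator: a two-state Markov chain
# decides wet/dry days; wet-day depths follow a gamma distribution. The
# parameters are illustrative; the dissertation fits them per month.
import numpy as np

rng = np.random.default_rng(5)

P_WET_GIVEN_DRY = 0.25                 # P(wet today | dry yesterday)
P_WET_GIVEN_WET = 0.55                 # P(wet today | wet yesterday)
GAMMA_SHAPE, GAMMA_SCALE = 0.8, 9.0    # wet-day depth in mm

def generate_month(n_days=30, wet=False):
    rain = np.zeros(n_days)
    for d in range(n_days):
        wet = rng.random() < (P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY)
        if wet:
            rain[d] = rng.gamma(GAMMA_SHAPE, GAMMA_SCALE)
    return rain

month = generate_month()
print(f"wet days: {(month > 0).sum()}, total: {month.sum():.1f} mm")
```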
Developing a Multicontextual Model of High Schools whose Students Participate in Financial Aid Preparation Services: Family, School, and Community Level Effects
The purpose of this quantitative secondary data analysis was to examine the effect of family, school, and community context on high schools whose students participate in financial aid preparation services. Data from the High School Longitudinal Study of 2009 were analyzed to answer the two research questions using Perna's conceptual model of college enrollment behaviors that explores how students gain and utilize information about financial aid and college prices. Descriptive statistics were used to determine the extent of high school participation in financial aid services. The results indicated a varying degree of these interventions being offered at high schools ranging from 22% to 52%. Schools sending students reminders of FAFSA deadlines (52%) and disseminating flyers/pamphlets on financial aid (50%) were the only two interventions that had a slight majority of schools participating. Multiple regression was used to determine if a relationship existed between the outcome variable (participation in financial aid preparation services) and several family context and school context predictor variables for eight financial aid interventions. Results revealed school context variables as the best predictors of the outcome variable. Counselor caseload and school control were the most effective in predicting high school participation in the eight financial aid preparation services, though these greatly differed according to the type of intervention. Findings provide potential implications for research and practice, including highlighting ways in which K-12 and higher education can coordinate.
Developing a Phylogeny Based Machine Learning Algorithm for Metagenomics
Metagenomics is the study of the totality of the complete genetic elements discovered from a defined environment. Different from traditional microbiology, which only analyzes the small percentage of microbes that can survive in the laboratory, metagenomics allows researchers to obtain the entire genetic information from all the samples in the communities. Metagenomics thus enables understanding of the target environments and the hidden relationships between bacteria and diseases. In order to efficiently analyze metagenomics data, cutting-edge technologies for analyzing the relationships among microbes and communities are required. To overcome the challenges brought by rapid growth in metagenomics datasets, advances in novel methodologies for interpreting metagenomics data are clearly needed. The first two chapters of this dissertation summarize and compare the widely used methods in metagenomics and integrate these methods into pipelines. Properly analyzing metagenomics data requires a variety of bioinformatics and statistical approaches to deal with different situations. The raw reads from sequencing centers need to be processed and denoised in several steps and then further interpreted by ecological and statistical analysis. Understanding these algorithms and combining different approaches can potentially reduce the influence of noise and bias at different steps, and an efficient and accurate pipeline is important to robustly decipher the differences and functionality of bacteria in communities. Traditional statistical analysis and machine learning algorithms have their limitations in analyzing metagenomics data. Thus, the remaining three chapters describe a new phylogeny based machine learning and feature selection algorithm to overcome these problems. The new method outperforms traditional algorithms and can provide more robust candidate microbes for further analysis. With growing sample sizes, a deep neural network could potentially describe more complicated characteristics of the data and thus improve model accuracy. So a deep learning framework is designed on top of the shallow learning algorithm stated above in order to further …
Developing a Self-Respect Instrument to Distinguish Self-Respect from Self-Esteem
Throughout the scientific literature, researchers have referred to self-respect and self-esteem as the same construct. The present study, however, advanced the position that they are two distinct constructs. In this quantitative study, an instrument was developed to measure self-respect as a construct and to demonstrate that self-respect is distinct from self-esteem. Exploratory factor analyses (EFA) indicated that 32.60% of the variance was accounted for by the 11-item Jefferson Self-Respect instrument (JSR), which measured self-respect as a unidimensional construct. The reliability estimate of the scores from the JSR reached an acceptable α = .82. Fit indices (RMSEA = .031, SRMR = .037, CFI = .982, and TLI = .977) from the confirmatory factor analyses (CFA) signified a well-fitted hypothesized model of self-respect as a unidimensional construct. Additionally, the CFA revealed that the constructs of self-respect and self-esteem were generally distinct, with a moderately positive correlation between the two (r = .62).
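For readers unfamiliar with the reliability statistic reported above (α = .82), the following Python sketch computes Cronbach's alpha on simulated 11-item data. The simulation is purely illustrative and does not reproduce the JSR data.

```python
# Minimal sketch: Cronbach's alpha for an 11-item scale, the reliability
# statistic the abstract reports. Data here are simulated stand-ins for
# Likert-type item responses from an instrument like the JSR.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 1))                       # one latent factor
items = latent + rng.normal(scale=1.0, size=(300, 11))   # 11 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```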
Developing a Soil Moisture-Based Irrigation Scheduling Tool (SMIST) Using Web-GIS Technology
Software as a service (SaaS) is a primary working pattern and a significant application model for next-generation Internet applications. Web GIS services are a new generation of SaaS that provide hosted spatial data and GIS functionality to practical, customized applications. This study focused on developing a webGIS-based application, the Soil Moisture-Based Irrigation Scheduling Tool (SMIST), for predicting soil moisture over the next seven days using the soil moisture diagnostic equation (SMDE) and the seven-day precipitation forecasts issued by the National Weather Service (NWS), and ultimately producing an accurate irrigation schedule based on the predicted soil moisture. The SMIST is expected to improve irrigation efficiency, protecting groundwater resources in the Texas High Plains and reducing the energy cost of pumping groundwater for irrigation, both essential public concerns in this area. The SMIST integrates web-based programs, a hydrometeorological model, GIS, and a geodatabase. It combines two main web systems: a soil moisture estimating web application for irrigation scheduling based on the SMDE, and an agricultural field delineation webGIS application that prepares the input data and model parameters. The SMIST takes advantage of the latest historical and forecasted precipitation data to predict soil moisture in the user-specified agricultural field(s). The predicted seven-day soil moisture is then presented against the soil-moisture threshold for normal growth on the SMIST results page, helping users adjust irrigation rate and sequence.
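The core prediction loop can be sketched as a simple daily water-balance recursion, as below. This is a generic stand-in for illustration only; the actual SMDE formulation and its parameters are not given in the abstract, and the loss coefficient, infiltration fraction, and threshold here are invented.

```python
# Schematic sketch of the tool's core idea: step a soil-moisture state
# forward seven days using forecast precipitation. This is a generic
# water-balance recursion, NOT the actual soil moisture diagnostic
# equation (SMDE); all parameters are illustrative.
def predict_soil_moisture(w0, precip_forecast, loss=0.05, infil=0.6,
                          w_max=0.45):
    """w0: initial volumetric soil moisture (m3/m3)
    precip_forecast: list of 7 daily precipitation depths (mm)
    Returns predicted soil moisture for each of the next 7 days."""
    w, series = w0, []
    for p in precip_forecast:
        w = w * (1 - loss) + infil * (p / 1000.0)  # drainage/ET loss + recharge
        w = min(w, w_max)                          # cap at field capacity
        series.append(round(w, 3))
    return series

# Example: compare forecasted moisture against an irrigation threshold.
forecast_mm = [0, 5, 12, 0, 0, 3, 0]   # hypothetical NWS 7-day forecast
threshold = 0.20                        # crop-specific trigger (assumed)
for day, w in enumerate(predict_soil_moisture(0.22, forecast_mm), 1):
    flag = "irrigate" if w < threshold else "ok"
    print(f"day {day}: {w:.3f} m3/m3  {flag}")
```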
Developing an Integrated Supply Chain Costing Approach for Strategic Decision Making
The supply chain management discipline suggests that information sharing is paramount when attempting to achieve cost reductions and quality improvements. In many cases, the traditional accounting data used to support strategic decisions reflect inaccurate supply chain costs. This research explores applications of managerial costing techniques and how they can improve the decision-making capabilities of firms in the aerospace and transportation industries. The methodology used to address the research questions was a hybrid of grounded theory and multiple-case study methods. The objective of this research was to present the antecedents and barriers associated with implementing supply chain costing, and the impact that costing approaches have on strategic decision making. The research identifies a theoretical model that can be used to explain the relationships and themes associated with supply chain costing and strategic decision making. Evidence suggests some movement to implement managerial accounting techniques within these two industries to capture supply chain costing information. However, the continued reliance on traditional financial accounting suggests that the overarching principles of supply chain management and information sharing among partner firms have yet to be realized.
Developing Culturally Responsive Literacy Teachers: Analysis of Academic, Demographic, and Experiential Factors Related to Teacher Self-efficacy
This mixed-methods study examined teachers' culturally responsive teaching (CRT) self-efficacy beliefs and the relationships among selected academic, demographic, and experiential factors. Guided by theoretical and empirical research on CRT, teacher dispositions, and assessment in teacher education (TE) programs for culturally and linguistically diverse (CLD) students, this study used an extended version of Siwatu's 2007 Culturally Responsive Teaching Self-Efficacy (CRTSE) Scale to conduct correlational and comparative statistical analyses. Data sources included surveys from 265 participants enrolled in TE classes in spring 2012 at two Texas universities (one private and one public). Content analyses were also conducted on participants' descriptions of CRT activities, using a priori and inductive coding methods to triangulate and elaborate the explanation of the quantitative results. In this population, those with higher CRTSE were typically young undergraduates specializing in ESL and bilingual certification coursework who felt their TE program prepared them well for working with CLD student populations. Regression analyses showed that certain certification areas (ESL, bilingual, elementary, and advanced) and perceptions of better-quality TE program preparation for working with CLD students emerged as significant predictors of increased CRTSE. Those with second-language skills were more efficacious in delivering linguistically responsive instruction, and those professing more experience with and interest in diverse individuals felt more confident applying CRT skills. While the younger teacher candidates felt more efficacious, their descriptions of CRT were less sophisticated than those of participants with more teaching experience. Despite much of the literature relating CRT to minority teachers, ethnicity was not a significant factor in heightened CRTSE. This study informs TE programs in better measuring and supporting teacher candidates' CRT development by revising and extending Siwatu's 2007 study in three ways. First, the CRTSE Scale instrument was extended to include items that address greater depth and breadth of the culturally responsive teaching continuum as …
Developing Oral Reading Fluency Among Hispanic High School English-language Learners: an Intervention Using Speech Recognition Software
This study investigated oral reading fluency development among Hispanic high school English-language learners. Participants included 11 males and 9 females from first-year, second-year, and third-year English language arts classes. The pre-post experimental study, conducted during a four-week ESL summer program, included a treatment and a control group. The treatment group received a combination of components, including modified repeated reading with self-voice listening and oral dictation output from a speech recognition program. Each day, students performed a series of tasks: dictating part of the previous day's passage; listening to and silently reading a new passage; dictating and correcting individual sentences from the new passage in the speech recognition environment; dictating the new passage as a whole without making corrections; and finally, listening to their own voice from their recorded dictation. This sequence was repeated in subsequent sessions; the intervention was thus a technology-enhanced variation of repeated reading with a pronunciation dictation segment. Research questions focused on improvements in oral reading accuracy and rate, facility with the application, student perceptions of the technology for reading, and the reliability of the speech recognition program. The treatment group improved oral reading accuracy by 50%, retained and transferred pronunciation of 55% of new vocabulary, and increased oral reading rate by 16 words correct per minute. Students used the intervention independently after three sessions. This independence may have contributed to students' self-efficacy, as they perceived improvements in their pronunciation and reading in general and reported an increased liking of school. Students initially had a very positive perception of using the technology for reading, but this perception decreased over the four weeks from 2.7 to 2.4 on a 3-point scale. The speech recognition program was reliable 94% of the time. The combination of the summer school program and intervention component stacking supported students' gains in oral …
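The fluency metrics reported above, oral reading accuracy and words correct per minute (WCPM), are simple ratios; the following sketch shows the standard calculations with hypothetical probe numbers, not the study's data.

```python
# Minimal sketch of standard oral reading fluency metrics: accuracy and
# words-correct-per-minute (WCPM). Probe numbers below are hypothetical.
def accuracy(words_read, errors):
    """Proportion of words read correctly."""
    return (words_read - errors) / words_read

def wcpm(words_read, errors, seconds):
    """Words correct per minute over a timed probe."""
    return (words_read - errors) / (seconds / 60.0)

# Hypothetical pre/post probes for one student (two-minute readings).
pre, post = wcpm(180, 22, 120), wcpm(210, 10, 120)
print(f"accuracy: {accuracy(180, 22):.0%} -> {accuracy(210, 10):.0%}")
print(f"rate: {pre:.0f} -> {post:.0f} WCPM (gain {post - pre:.0f})")
```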
Developing Policy for a Tech Program Based on Understanding Organizational Practices
This thesis contributes to research informing organizational management and organizational anthropology. It examines the internal hierarchy and organizational practices of a Tech Company and describes how the findings contributed to policy recommendations supporting a "guild" model for organizational success. Data collection and research were undertaken while working as an employee of the Tech Program, and analysis continued past the end of that phase of work. Methods included semi-structured interviews, which captured the sentiments and understandings of employees within the organization, and a questionnaire, which revealed the sentiments and experiences of former employees. These were buttressed with participant observation conducted through a participatory action research methodology. The findings add to work directed toward understanding the effects of Founder's Syndrome within organizations. Additionally, this thesis contributes to a growing body of research on best practices for fostering positive organizational growth by creating lines of communication from front-line employees to management-level employees.
Developing Precipitation Hardenable High Entropy Alloys
High entropy alloys (HEAs) are alloys constructed from five or more elements mixed in near-equal proportions; they are also known as multi-principal element alloys (MPEs) or complex concentrated alloys (CCAs). This dissertation presents research conducted to develop precipitation-hardenable high entropy alloys using a much-studied fcc-based equiatomic quaternary alloy (CoCrFeNi). Minor additions of aluminum make the alloy amenable to precipitating ordered intermetallic phases in an fcc matrix; aluminum also affects grain growth kinetics and Hall-Petch hardenability. A combinatorial approach was used to assess composition-microstructure-property relationships in high entropy alloys, or more broadly in complex concentrated alloys, with laser-deposited, compositionally graded AlxCrCuFeNi2 (0 < x < 1.5) alloys as a candidate system. The composition gradient was achieved from CrCuFeNi2 to Al1.5CrCuFeNi2 over a length of ~25 mm, deposited via the laser engineered net shaping process from a blend of elemental powders. With increasing Al content, there was a gradual change from an fcc-based microstructure (including the ordered L12 phase) to a bcc-based microstructure (including the ordered B2 phase), accompanied by a progressive increase in microhardness. Based on this combinatorial assessment, two promising fcc-based precipitation-strengthened systems were identified, Al0.3CuCrFeNi2 and Al0.3CoCrFeNi, and both compositions were subsequently thermo-mechanically processed via conventional techniques. The phase stability and mechanical properties of these alloys have been investigated and are presented. Additionally, the activation energy for grain growth as a function of Al content in these complex alloys was investigated: the change in fcc grain growth kinetics was studied as a function of aluminum content, and the apparent activation energy for grain growth increases by about three times going from Al0.1CoCrFeNi (3 at% Al) to Al0.3CoCrFeNi (7 at% Al). Furthermore, Al addition leads to the precipitation of highly refined ordered L12 (γ′) and B2 precipitates in …
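Apparent activation energies for grain growth, as mentioned above, are typically extracted from an Arrhenius-type kinetic law. The relation below is the standard textbook form, offered as a plausible reading of the analysis rather than the dissertation's exact fitted model.

```latex
% Standard grain-growth kinetics used to extract an apparent activation
% energy; a generic textbook relation, not necessarily the exact form
% fitted in the dissertation.
\[
  d^{\,n} - d_{0}^{\,n} = k_{0}\, t \,\exp\!\left(-\frac{Q}{RT}\right)
\]
% d   : grain size after annealing time t
% d_0 : initial grain size,   n : grain-growth exponent (n = 2 ideally)
% k_0 : pre-exponential constant,   Q : apparent activation energy
% R   : gas constant,   T : absolute temperature
% Q follows from the slope of ln[(d^n - d_0^n)/t] plotted against 1/T.
```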
Developing Variation and Melodic Contour Analysis: A New Look at the Music of Max Reger
Max Reger was a prolific composer on the threshold of modernism, and the style of his extensive musical output polarized his contemporaries. A common criticism of Reger's music is its complex, dense musical structure: despite writing tonal music, Reger often pushes the boundaries of tonality so far that any sense of formal organization seems imperceptible. In this dissertation, I offer a new way of discerning Reger's motivic relationships and formal structures within and between movements, drawing on three primary tools and methods: Schoenberg's developing variation; melodic contour analysis as discussed by Elizabeth West Marvin and Diana Deutsch; and Janet Schmalfeldt's motivic cyclicism stemming from internal themes. I examine five musical works by Reger: the D minor Piano Quartet; the Clarinet Quintet; the Piano Concerto; the String Quartet, op. 121; and the E minor Piano Trio, op. 102. My analysis shows how Reger relies on the melodic contours of his motives to connect musical moments across entire movements and across multi-movement works. These motives are developed and often mark structurally significant moments, providing the organization often perceived as missing in Reger's music.
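To illustrate the kind of contour analysis invoked above, the following Python sketch computes a contour segment (CSEG), the representation used in the contour-theory literature Marvin draws on: each pitch is replaced by its rank within the segment, so motives with different intervals but the same shape share a CSEG. The example motives are invented, not drawn from Reger.

```python
# Illustrative contour-segment (CSEG) computation: pitches are mapped to
# their rank within the segment (0 = lowest), abstracting away intervals
# while preserving melodic shape. Example pitches are MIDI numbers.
def cseg(pitches):
    """Map a pitch sequence to its contour segment (0 = lowest)."""
    ranks = {p: i for i, p in enumerate(sorted(set(pitches)))}
    return [ranks[p] for p in pitches]

# Two motives with different intervals but identical contour <1 0 2>.
print(cseg([64, 60, 67]))   # E4 C4 G4  -> [1, 0, 2]
print(cseg([62, 57, 69]))   # D4 A3 A4  -> [1, 0, 2]
```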
Development and Analysis of a Mobile Node Tracking Antenna Control System
A wireless communication system allows two parties to exchange information over long distances. The antenna is the component of a wireless communication system that allows information to be converted into electromagnetic radiation that propagates through the air. A system using an antenna with a highly directional beam pattern allows for high power transmission and reception of data. For a directional antenna to serve its purpose, it must be accurately pointed at the object it is communicating with. To communicate with a mobile node, knowledge of the mobile node's position must be gained so the directional antenna can be regularly pointed toward the moving target. The Global Positioning System (GPS) provides an accurate source of three-dimensional position information for the mobile node. This thesis develops an antenna control station that uses GPS information to track a mobile node and point a directional antenna toward the mobile node. Analysis of the subsystems used and integrated system test results are provided to assess the viability of the antenna control station.
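The pointing computation at the heart of such a control station can be sketched as converting a GPS fix into azimuth and elevation commands. The sketch below uses a simple local flat-earth (ENU) approximation and invented coordinates, which may differ from the thesis's actual geodetic computation.

```python
# Hedged sketch of the pointing problem: convert a mobile node's GPS fix
# into azimuth/elevation angles for the antenna positioner, using a
# local flat-earth (east-north-up) approximation. Coordinates invented.
import math

EARTH_R = 6371000.0  # mean Earth radius, meters

def point_antenna(station, node):
    """station, node: (lat_deg, lon_deg, alt_m). Returns (az_deg, el_deg)."""
    lat0, lon0, alt0 = station
    lat1, lon1, alt1 = node
    north = math.radians(lat1 - lat0) * EARTH_R
    east = math.radians(lon1 - lon0) * EARTH_R * math.cos(math.radians(lat0))
    up = alt1 - alt0
    az = math.degrees(math.atan2(east, north)) % 360.0   # clockwise from north
    el = math.degrees(math.atan2(up, math.hypot(north, east)))
    return az, el

# Example: node roughly 1 km northeast of the station and 120 m above it.
az, el = point_antenna((33.2100, -97.1500, 200.0),
                       (33.2164, -97.1424, 320.0))
print(f"azimuth {az:.1f} deg, elevation {el:.1f} deg")
```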
Development and Evaluation of a Large-scale Pyramidal Staff Training Program for Behavior Management
Training and empirically evaluating caregivers' implementation of behavior management skills is a particularly challenging task in large residential contexts. A pyramidal training approach provides an efficient and effective way to conduct large-scale, competency-based behavior skills training. The purpose of this project was to develop and evaluate a large-scale pyramidal staff training program for behavior management skills. One hundred nine caregivers and 11 behavior service professionals at a large residential care facility participated. Interobserver agreement was used to develop and refine measurement systems to detect caregivers' acquisition of skills, behavior service professionals' ability to score caregiver performance, and their ability to deliver a specified portion of the curriculum. Pre- and post-test probes using standard role-play scenarios and checklists evaluated caregivers' acquisition of three specific behavior management skills. The results supported the following conclusions: first, interobserver agreement measures were useful for developing a reliable measurement system, refining some curriculum elements, and evaluating measurement conducted by behavior service professionals. Second, behavior skills training (BST) resulted in caregiver acquisition of all three behavior management techniques. Third, the pyramidal training approach was effective in teaching behavior service professionals to deliver BST and to accurately measure the performances of trainees.
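The interobserver agreement (IOA) measure referenced above is commonly computed point by point across checklist items; the following sketch shows that standard calculation with hypothetical observer data, not the project's records.

```python
# Minimal sketch of a point-by-point interobserver agreement (IOA)
# statistic: percentage of checklist items on which two observers'
# scores match. Observer data below are hypothetical.
def point_by_point_ioa(obs_a, obs_b):
    """obs_a, obs_b: equal-length lists of 0/1 checklist scores."""
    agreements = sum(a == b for a, b in zip(obs_a, obs_b))
    return 100.0 * agreements / len(obs_a)

primary = [1, 1, 0, 1, 1, 1, 0, 1]      # primary observer's scoring
secondary = [1, 1, 0, 1, 0, 1, 0, 1]    # reliability observer's scoring
print(f"IOA = {point_by_point_ioa(primary, secondary):.1f}%")  # 87.5%
```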
Development and Integration of a Low-Cost Occupancy Monitoring System
The world is getting busier and more crowded each year, and resources such as public transport, available energy, and usable space are becoming congested, requiring vast amounts of logistical support. As of February 2018, nearly 95% of Americans own a mobile phone, according to the Pew Research Center. These devices consistently broadcast their presence to other devices. By leveraging this data to provide occupancy awareness of high-traffic areas such as public transit stops and buildings, logistical efforts can be streamlined to best suit the dynamics of the population. With the rise of the Internet of Things, a scalable, low-cost occupancy monitoring system can be deployed to collect this broadcast data and present it to logistics personnel in real time. Simple IoT devices such as the Raspberry Pi, wireless cards capable of passive monitoring, and specialized software can provide this capability. Additionally, this combination of hardware and software can be integrated to be as simple as a typical plug-and-play setup, making system deployment quick and easy. This effort details the development and integration work done to deliver a working product that acts as a foundation to build upon. Machine learning algorithms such as k-Nearest-Neighbors were also developed to estimate a mobile device's approximate location inside a building.
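A minimal sketch of the k-Nearest-Neighbors location estimation described above, assuming a fingerprint database of signal-strength (RSSI) readings at surveyed indoor positions; all readings, positions, and the three-monitor setup are illustrative, not the system's actual data.

```python
# Hedged sketch of kNN indoor location estimation: match a device's
# observed Wi-Fi signal strengths (RSSI) against fingerprints recorded
# at known positions. All values below are illustrative.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Training fingerprints: RSSI (dBm) seen by three Raspberry Pi monitors,
# collected at surveyed (x, y) positions inside the building (meters).
rssi_db = np.array([
    [-40, -70, -80],
    [-55, -55, -75],
    [-70, -45, -60],
    [-80, -60, -45],
])
positions = np.array([[1.0, 1.0], [4.0, 2.0], [8.0, 3.0], [12.0, 5.0]])

knn = KNeighborsRegressor(n_neighbors=2).fit(rssi_db, positions)

# A new probe-request observation from an unknown device.
observed = np.array([[-52, -58, -72]])
print("estimated position:", knn.predict(observed)[0])
```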