PRISM | Institutional Repository
Communities in PRISM
Select a community to browse its collections.
Recent Submissions
Evaluating the Phenotypic Consequences of DNA-PKcs Deficiency
(2025-01-14) Kenny, Jacey; Lees-Miller, Susan; Billon, Pierre; Moorhead, Gregory
DNA-dependent protein kinase catalytic subunit (DNA-PKcs) is a serine/threonine protein kinase with a well-established role in the repair of DNA double strand breaks. Similarly, Ataxia-Telangiectasia Mutated (ATM) is a protein kinase that plays a central role in the downstream signalling response to DNA damage. DNA-PKcs and ATM are members of the phosphatidylinositol 3-kinase-like protein kinase (PIKK) family, which serves critical roles in the cellular response to DNA damage. To better understand the non-canonical roles of DNA-PKcs and ATM, our lab generated human cell lines with CRISPR/Cas9-mediated disruption of these proteins, marking the starting point of my project. My work demonstrates that loss of DNA-PKcs and/or ATM expression results in slower proliferation compared to parental cells, while inhibition of DNA-PKcs or ATM kinase activity had a comparatively minor effect on cell proliferation. My data suggest that this phenotype is not due to increased apoptosis, endogenous DNA damage, or cell cycle defects. Comparison of the metabolite profiles of the matched control and CRISPR cell lines suggested enrichment of amino acids in the HeLa CRISPR DNA-PKcs cells, which led me to hypothesize that the CRISPR cell lines have a reduced rate of protein translation. I present evidence that cells deficient in DNA-PKcs and ATM exhibit impaired global translation of new proteins, revealing a potentially novel mechanism for the slow growth phenotype. This work adds to a growing body of evidence encouraging further exploration of DNA-PKcs and ATM as therapeutic targets in cancer and other diseases.
Correlating Obstructive Symptoms with Stricture Severity in Ileal Crohn’s Disease
(2024-11-28) Saunders, Bethany; Lu, Cathy; Bonni, Shirin; Novak, Kerri; Ma, Christopher; Ganesh, Aravind
Crohn’s Disease (CD) can manifest as inflammatory (non-stricturing), fibrostenotic (stricturing), and fistulizing or penetrating phenotypes. Stricturing CD, the main focus of this thesis, involves luminal narrowing and is often progressive, leading to obstruction of the affected bowel segment. Obstructive symptoms (OS) in CD strictures can be assessed using the OS scoring system used in the Stricture Definition and Treatment (STRIDENT) trial and the CREOLE OS score. Clinically, OS are typically linked to advanced strictures, which can often result in dietary restrictions and surgery. No studies published to date have evaluated the relationship between the severity of CD strictures and OS. Understanding the correlation between symptoms and strictures is crucial, as regulatory bodies assess drug efficacy based on symptom response in CD trials. Intestinal ultrasound (IUS) accurately evaluates stricture morphology, severity, and length using established criteria: bowel wall thickness > 3 mm, luminal narrowing (< 1 cm), with any pre-stenotic dilation (PSD) size (cm). This thesis-based study aims to determine whether the OS of pain, bloating, nausea, and/or vomiting differ among CD phenotypes, and whether specific OS correlate with any of the definitive stricture parameters.
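The IUS criteria quoted in the abstract (bowel wall thickness > 3 mm, luminal narrowing < 1 cm, any pre-stenotic dilation size) can be read as a simple conjunctive rule. The sketch below is purely illustrative of that reading; the function and parameter names are assumptions, and this is in no way a clinical tool.

```python
# Illustrative encoding of the IUS stricture criteria cited in the abstract:
# bowel wall thickness > 3 mm, luminal narrowing < 1 cm, and pre-stenotic
# dilation (PSD) of any size. Names are hypothetical; not a clinical tool.

def meets_ius_stricture_criteria(bowel_wall_mm: float,
                                 lumen_cm: float,
                                 psd_cm: float) -> bool:
    """Return True if a bowel segment meets all three quoted IUS criteria."""
    return (
        bowel_wall_mm > 3.0   # thickened bowel wall (> 3 mm)
        and lumen_cm < 1.0    # narrowed lumen (< 1 cm)
        and psd_cm >= 0.0     # PSD recorded; any size qualifies per the text
    )

print(meets_ius_stricture_criteria(4.2, 0.6, 2.1))  # True: all criteria met
print(meets_ius_stricture_criteria(2.5, 1.4, 0.0))  # False: wall not thickened
```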
Disease control practices used to prevent morbidity and mortality in preweaned beef calves
(2025-01-15) Sanguinetti, Virginia Margarita; Windeyer, Claire; Checkley, Sylvia; Adams, Cindy; Campbell, John; Morley, Paul; Smith, David
Calf morbidity and mortality negatively impact economic returns for cow-calf producers. Given this, preventing infectious diseases, including Neonatal Calf Diarrhea (NCD) and Bovine Respiratory Disease (BRD), is essential. However, recommended practices to attain this have not been recently summarized or updated. Therefore, the aim of this thesis was to study several aspects relevant to the implementation of disease control and update recommended practices. The objectives were to: i. summarize the scientific evidence on the effectiveness of practices in preventing health and mortality outcomes, ii. prioritize practices based on their usefulness in herds considering their effectiveness, ease of implementation, and economic feasibility, iii. assess the frequency of outbreaks, the use of practices, and the impact of practices on outbreaks across Canadian cow-calf herds, as well as the importance given to productivity parameters across regions, and iv. compile the evidence into a Calf Health Decision Tool to support discussions between producers and veterinarians and pilot it in Alberta, Canada. The work reported in Chapter 2 showed that evidence of the impact of practices on mortality, regardless of the cause, was scarce. Only a few practices showed statistically significant associations. Herds that routinely intervened with colostrum or checked the fullness of the udder had a lower mortality risk than those not using these practices. Herds that calved early or during winter had a higher mortality risk than those calving later. Herds with longer calving seasons had a higher mortality risk than those with shorter seasons. Calves from herds that did not supplement with vitamin E and selenium at birth had higher odds of mortality than those from herds where this practice was used. Chapter 3 showed that most practices impacted both NCD and BRD. However, the evidence was of low and very low certainty.
Chapter 4 found that veterinarians prioritized the effectiveness of a practice over its ease of implementation and economic feasibility. Vaccinating calves against clostridial disease and providing colostrum via an oesophageal tube or nipple bottle in case a calf had not nursed were considered always useful for all herds. Most practices were shown to have intermediate levels of usefulness in herds. Prophylactic and metaphylactic use of antibiotics were considered among the least useful. Yet, all practices considered at least very useful for some herds were deemed relevant enough to be included in a Calf Health Decision Tool. Chapter 5 found that over 40% of herds had at least one type of outbreak during the last three calving seasons. It also demonstrated that eastern and western cow-calf herds were managed differently, and that some frequently used practices increased the odds of having outbreaks. Western and eastern producers gave similar importance to several productivity parameters of their herds. Chapter 5 also revealed that the Tool was useful for facilitating discussions between producers and veterinarians regarding disease control. Responses showed that delaying the calving season for early-calving herds was not feasible. Most herds could follow only one of two recommended practices, either calving heifers before cows or keeping the calving season shorter than 80 days; there may therefore be an incompatibility between these recommendations. Most producers were willing to consider some method of segregating calves by age, so this could become a more widespread recommended practice. Only half of the herds vaccinated dams against NCD, and even one herd with a history of NCD outbreaks was not doing so, possibly because producers were reluctant to handle pregnant dams in the chute. However, they were more willing to consider calf vaccination instead.
The findings presented in this thesis reveal a general lack of consistent evidence demonstrating the effectiveness, ease of implementation, and economic feasibility of practices. Overall, the work in this thesis showed that many commonly used practices make herds vulnerable to outbreaks, and thus, tailoring disease control practices to the operation is essential. The Calf Health Decision Tool created here may help support discussions between veterinarians and producers to tailor those practices and prevent disease and mortality.
Building Better from LEED to Living: An Approach to Net Zero Water Management
(2024-12) Fatima, Kulsum; Assefa, Getachew; Tyler, Mary Ellen; Hilmi, Tawab
In view of increased concerns about the United Nations Sustainable Development Goals (UNSDGs), there is a need to minimize water-related challenges and maximize water security and availability through sustainable green building practices. As significant water consumers, university campuses play a crucial role in addressing water-related challenges. By focusing on water-efficient buildings to achieve net-zero water management and exploring the influence of green building practices, universities can contribute to the UNSDGs, particularly in promoting sustainable water management, climate action, and the development of resilient communities. This thesis explores the influence of green building practices on water management by comparing objectives, directives, and applications suggested by the LEED certification system and the Living Building Challenge (LBC) certification system. It aims to understand the advantages and challenges in transitioning UCalgary from LEED-guided water-efficient operations to LBC-compliant net-zero water operations. This research evaluates best practices from three design precedent sites: CIRS at the University of British Columbia, the Bullitt Center, affiliated with the University of Washington, and the Kendeda Building at the Georgia Institute of Technology. For this purpose, two evaluation frameworks are used: the Performance Assessment Framework (PAF) and the Water Literacy Assessment Framework (WLF). UCalgary and the design precedent sites are evaluated using the PAF and WLF frameworks to identify challenges and barriers to water performance efficiency and improve water management practices.
This research hopes to encourage decision-makers and practitioners at higher education institutions (HEIs) to achieve net-zero compliance as per LBC, minimize performance gaps and water-related challenges, motivate water managers to develop an operational net-zero water scenario, and incentivize water users to support this scenario by promoting good water use behaviours on HEI campuses. This research provides insights into the factors influencing the adoption of LEED and LBC at higher education institutions. It also highlights the challenges and barriers faced by project teams involved in LEED and LBC applications. Additionally, the research identifies problem situations related to selecting and determining water strategies for LBC compliance. The understanding gained from this research is valuable for addressing the complexities of water management and promoting an appreciation of the value of water in everyday life.
Integrating Queue Dynamics into the Trip-Based Macroscopic Fundamental Diagram
(2025-01-15) Hassanin, Omar; Kattan, Lina; Demissie, Merkebe Getachew; He, Jianxun (Jennifer)
Macroscopic traffic models provide a simplified framework for analyzing and controlling traffic at the network level. Among these models, the trip-based macroscopic fundamental diagram (MFD), or generalized bathtub model, effectively captures inflow, accumulation, and outflow while considering trip lengths and travel times, particularly under rapidly changing traffic conditions. This study addresses a gap in the trip-based MFD, which lacks queuing dynamics under downstream restrictions, as its original outflow function assumes unrestricted flow. To resolve this, the research incorporates downstream queuing dynamics, accounting for waiting times and their changes over time. The study also examines the impact of connected and automated vehicles (CAVs) on traffic dynamics through numerical simulations. Assuming that CAVs improve the MFD shape and bottleneck capacity, a sensitivity analysis was conducted for various market penetration rates (MPRs). Key findings include: 1) Average upstream waiting time decreased from 130 seconds to 0 at 60% MPR; 2) Average reservoir travel time was reduced by 60% (570 to 230 seconds) at 100% MPR; 3) Average downstream waiting time initially increased by 25% at 70% MPR but fell by 12.5% at 100% MPR; and 4) Average total trip time (TTT) dropped by 61%, from 780 to 300 seconds. The results show earlier commuter exits, shifting the peak time (t_p) of TTT. However, as t_p depends on external factors (e.g., work schedules), the inflow demand pattern was adjusted to hold t_p constant, revealing a 26-minute inflow peak shift at 100% MPR. Additionally, congestion duration was reduced by 51%, from 96.6 to 46.6 minutes, while maximum travel time decreased by 66% (18.5 to 9.5 minutes). Lastly, the preceding capacity enhancements maintained a constant free-flow speed (v_f) for safety reasons. However, when the MPR reaches 100% and there are no longer any human-driven vehicles, increasing v_f no longer poses safety risks.
Thus, v_f can reach 21 m/s instead of 15 m/s, leading to less congestion. The findings demonstrate that CAVs significantly reduce trip time and congestion but cannot alone eliminate the negative impacts of downstream bottlenecks.
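As a rough illustration of the trip-based MFD ("bathtub") idea with a capacity-restricted downstream outflow, the sketch below steps a single reservoir forward in time: vehicles enter with a trip length, travel at a speed that falls with accumulation, and queue at a bottleneck that discharges at a fixed rate. The Greenshields-type speed-accumulation relation, all parameter values, and the function name are illustrative assumptions, not the thesis model.

```python
# Minimal time-stepped sketch of a trip-based MFD ("bathtub") reservoir with a
# capacity-limited downstream bottleneck. All parameters are illustrative.

def simulate(inflow, trip_len=2000.0, v_f=15.0, n_jam=200,
             cap_per_step=1, dt=1.0, steps=600):
    """inflow(t) -> vehicles entering at step t; returns (served, in_reservoir, queued)."""
    active = []   # remaining trip distance (m) of each vehicle in the reservoir
    queue = 0     # vehicles done with their trip but held at the bottleneck
    served = 0    # vehicles discharged downstream
    for t in range(steps):
        active.extend([trip_len] * inflow(t))
        n = len(active)
        # Assumed Greenshields-type speed-accumulation relation (the MFD shape)
        v = v_f * max(0.0, 1.0 - n / n_jam)
        active = [d - v * dt for d in active]
        finished = sum(d <= 0 for d in active)
        active = [d for d in active if d > 0]
        queue += finished
        out = min(queue, cap_per_step)  # restricted outflow: a queue can form
        queue -= out
        served += out
    return served, len(active), queue

# A 100-step pulse of demand; the bottleneck meters vehicles out one per step.
print(simulate(lambda t: 1 if t < 100 else 0))
```

Raising `cap_per_step` or `v_f` (as the abstract's CAV scenarios assume) shortens the queue and the time to serve the pulse; choking `cap_per_step` reproduces the downstream-restriction regime the original outflow function cannot capture.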