Faculty of Veterinary Medicine (Eläinlääketieteellinen tiedekunta)


Recent Submissions

  • Tirkkonen, Birger Taneli (Helsingin yliopisto, 2017)
    More than 150 species of mycobacteria have been described, most being opportunistic pathogens and all representing a risk to human and animal health. Human infections derived from environmental mycobacteria are increasing in both industrialized and developing countries. The most susceptible groups are children, the elderly and those, including animals, with immunocompromising conditions. Drug therapy for mycobacteriosis is difficult and not always successful. Infections caused by drug-resistant mycobacteria can be life-threatening even for healthy adults and thus represent a real risk to humans. Environmental mycobacterial infections of pigs are usually without clinical signs, and the lesions are mainly detected at slaughter. Mycobacterium-infected pork can pass into human consumption due to the poor sensitivity of visual meat inspection at slaughterhouses, and mycobacteria in pigs also cause economic losses due to condemnation of carcasses. The main challenge is the evaluation of the hygiene risk associated with using mycobacteria-contaminated pork. Most environmental mycobacterial species have been isolated from sources such as water, swimming pools, soil, plants and bedding material. In our study, mycobacterial growth in piggeries was identified in all bedding materials (sawdust, straw, peat and wood chips) in most cases, in water and food samples in many cases, and only occasionally in dust and on wall surfaces. The maximum number of mycobacteria was almost 1 billion (10⁹) per gram of bedding, which is close to the maximum concentration in any growth medium. Mycobacteria can multiply in piggeries and contaminate feed and water. Isolation of mycobacteria from pig faeces can be considered an indicator of the risk of human infection. Environmental mycobacteriosis in humans and pigs is mainly caused by M. avium subsp. hominissuis. 
There is little evidence of direct transmission from animals to humans, but particular strains can be recovered from both humans and pigs. In our studies, identical mycobacterial RFLP and MIRU-VNTR fingerprints of porcine and human origin were evident. Interspecies clusters were more common than intraspecies clusters with both methods. Therefore, we concluded that pigs may act as a reservoir for virulent M. avium strains and a vector for transmission of infection to humans, or that humans and pigs share an identical source of infection. Culturing mycobacteria is the gold standard for diagnosis, but detection of environmental mycobacteria based on cultivation and biochemical methods can take several weeks. Culture-independent, rapid and accurate techniques for detecting mycobacteria in food and feed chains are urgently needed. In this work, we developed a rapid and accurate real-time quantitative PCR assay for detecting environmental mycobacteria in bedding materials and pig organs. Conclusion: Mycobacteria can multiply in bedding materials, and the consequent heavy contamination can cause simultaneous infections in pigs. Mycobacterial DNA was found in pig organ samples, including those without lesions, and similar strains were found in humans and in pig organ samples, suggesting that mycobacteria can be transmitted between humans and pigs.
  • Grönthal, Thomas (Helsingin yliopisto, 2017)
    Staphylococcus pseudintermedius (SP) is part of the normal microbiota of dogs and cats. Since the mid-1980s, an ever-increasing number of methicillin-resistant SP (MRSP) isolates have been reported. In the mid-2000s, two predominant MRSP clones, ST71 (sequence type 71) and ST68, spread through Europe and North America, respectively. MRSP isolates are commonly multidrug resistant (MDR), and are thus capable of causing infections that do not respond to routinely used antimicrobials. MRSP appeared in the small animal population of Finland in the late 2000s, also causing numerous infections at the Veterinary Teaching Hospital (VTH) of the University of Helsinki. No data on the epidemiology of MRSP in Finland have been published. This thesis aimed to explore the epidemiology of MRSP in the Finnish small animal population. This was done by investigating and describing the MRSP outbreak at the VTH and investigating risk factors for patients being colonized or infected by MRSP in the hospital during the outbreak. The prevalence of MRSP and the risk factors for MRSP carriage were investigated in a canine subpopulation at the Guide Dog School for the Visually Impaired. The susceptibility of SP isolates to antimicrobials in 2011–2015 was also investigated using data from the Clinical Microbiology Laboratory (CML) of the Faculty of Veterinary Medicine, University of Helsinki. Risk factors for an SP isolate being MRSP, as well as for a screening specimen revealing MRSP, were also investigated. Furthermore, the molecular epidemiology of all MRSP isolates stored in 2010–2014 was investigated using pulsed-field gel electrophoresis (PFGE), multi-locus sequence typing (MLST) and staphylococcal cassette chromosome mec (SCCmec) typing. Antimicrobial therapy, whether previous or ongoing during sampling, was a risk factor for MRSP in all studies. Furthermore, a prolonged hospital stay and veterinary visits were risk factors among guide dogs. 
An SP isolate originating from a private clinic (versus the VTH) was a significant risk factor for MRSP among clinical specimens. The same could be seen among screening specimens from patients with risk factors for MRSP. In addition, it was noted that a sizeable proportion (~20–60%) of animals in the studies had been or were being treated with antimicrobials. During the outbreak at the VTH, rigorous hygiene and barrier measures were necessary to achieve control. ST71, the MRSP clone that caused the outbreak, was the predominant clone in 2010–2011, accounting for over 50% of MRSP isolates, even among non-outbreak-related isolates. By 2014 the situation had changed, as ST71 represented only ~10% of MRSP isolates. MRSP clones belonging to CC45 (clonal complex 45) and CC258, and other unrelated STs, dominated the MRSP population by that time. SCCmec type IV was detected in a majority of different STs, indicating the horizontal spread of resistance genes. The prevalence of MRSP was only 3% among guide dogs. The proportion of clinical specimens from small animals that revealed MRSP was similar, 2.5%. However, 9% of screening specimens from high-risk patients revealed MRSP. Overall, 14% of SP isolates were MRSP. Roughly 30–40% of isolates were not susceptible to alternative antimicrobials, such as lincosamides, macrolides, or tetracyclines. MRSP in feline specimens was rare (<1% in all specimens after 2011). Our results give further credence to the hypothesis that antimicrobial therapy and contact with the veterinary environment are risk factors for MRSP in small animals. However, cats do not appear to be a significant source of MRSP. Our data suggest that the epidemiology of MRSP has changed from a predominantly clonal spread to a mix of clonal spread and the spread of genetic elements. The resistance rates among SP are at an alarming level. 
Decisive action, including the use of non-antimicrobial treatments whenever feasible and more prudent use of antimicrobials, is required to improve the situation.
  • Mikkola, Marja (Helsingin yliopisto, 2017)
    Multiple ovulation and embryo transfer (MOET) has been established in cattle breeding since the 1970s. It is an efficient means to increase the number of offspring from genetically superior females. Despite nearly 50 years of development, the average number of transferable embryos recovered in a single embryo collection has remained nearly constant at approximately six per donor. Several animal-related, environmental and management factors contribute to the outcome of superovulation and embryo recovery. The most prominent factor affecting the success of superovulation is an animal-related attribute: the ovarian follicular population responsive to exogenous gonadotropin stimulation. Environmental factors, such as heat stress or other external stressors, can compromise the embryo yield after superovulation; such factors represent a management challenge. The superovulatory outcome is additionally affected by several factors that are subject to management decisions, including nutritional management of the donor, the gonadotropin treatment protocol, and the semen and technical performance of donor inseminations. The purpose of the work presented in this thesis was to investigate management factors that affect the efficacy of MOET. First, the effect of nutritional protein in the form of rapeseed meal on superovulation of dairy heifers was studied. One-year-old heifers were fed diets formulated to meet energy requirements for 800 g daily gain, with crude protein at either 14% or 18%, the latter being higher than the feeding recommendations and common farm practice. The higher protein level had no effect on the ovulation rate, the total number of embryos recovered or the number of transferable embryos. Feeding an energy-adequate diet containing moderate or high protein with respect to feeding recommendations resulted in comparable embryo yields. 
The efficacy of two commercial FSH products was compared in a retrospective study of superovulations of heifers and cows on Finnish dairy farms and at an embryo collection station. A highly purified porcine FSH with a low LH:FSH ratio, Folltropin, was used for 2592 superovulations, and Pluset, containing equal amounts of LH and FSH, was used for 1398 superovulations. Pluset-treated donors had a higher ovulation rate, yielding 1.1 more recovered structures (embryos and ova) than those receiving Folltropin. However, the difference consisted mainly of additional unfertilized ova (UFO). The number, quality and developmental stage of transferable embryos were similar for both preparations. It can therefore be concluded that the efficacy of the two preparations is comparable. The effect of sex-sorted semen on the efficacy of MOET was investigated using a dataset of commercial embryo collections and transfers. A total of 443 embryo collections produced with sex-sorted semen and 1528 produced with conventional semen were analyzed. Sex-sorted semen decreased the number of transferable embryos and increased the proportions of UFO and degenerated embryos compared with non-sorted semen. The decrease was more evident in cows than in heifers. The proportion of poor-quality embryos was higher, and there was a slight delay in embryo developmental kinetics, for sexed embryos. The risk of recovering no transferable embryos was increased when sex-sorted semen was used. Pregnancy rates after transfer of embryos produced with sex-sorted semen were 12% lower than for embryos originating from conventional semen. It can be concluded from these studies that the use of sex-sorted semen is profitable because more female calves can be produced from a donor heifer while using fewer recipient resources. 
For superovulated cows, equal numbers of female calves can be produced per embryo collection, but the need for only half the number of recipients compared with using conventional semen favors the use of sexed semen when female progeny are desired.
  • Kaukonen, Eija (Helsingin yliopisto, 2017)
    Contact dermatitis in broilers is a multifactorial condition most commonly caused by poor litter quality or otherwise unsuitable material affecting the footpad or hock skin. Footpad health is mainly maintained by keeping litter dry and friable. Hence, footpad lesions reflect litter quality, which more broadly describes housing conditions and bird health. Evaluating the prevalence of contact dermatitis is a commonly accepted approach to assessing the welfare of broiler flocks. However, there is a lack of knowledge about footpad lesions in broiler breeders. Although numerous studies on the effect of litter materials on footpad condition have been conducted, experiments with peat are scarce, and knowledge of the influence of peat on hock burns and litter quality is lacking. Modern fast-growing broilers spend excessive time resting, and this inactivity has been suggested to increase the incidence of impaired gait and leg disorders. Tibial dyschondroplasia (TD) is one of the most common leg pathologies in broilers. Perches or elevated platforms add complexity to the broilers’ environment and may stimulate locomotion. However, research on the use of elevated structures under commercial rearing conditions and their possible benefits for broiler leg health is limited. This thesis provides descriptive information about contact dermatitis and breast blisters in broiler breeders throughout the production period with respect to litter condition. Secondly, the study compared the influence of peat bedding with wood shavings and ground straw (finely crushed straw) on contact dermatitis and plumage cleanliness in fast-growing broilers and on litter condition in commercial broiler houses. 
Furthermore, the study examined the use of perches and elevated platforms by broilers, and the impact of this additional equipment on contact dermatitis, plumage cleanliness, walking ability, the occurrence of TD and litter condition under intensive rearing conditions. Litter condition in broiler and breeder houses was evaluated according to the Welfare Quality® (WQ) protocol for broiler chickens. Additionally, litter height was measured, and litter quality was determined from moisture, pH and ammonia content. Footpad condition was visually inspected with the WQ scoring method (broilers), the official Finnish system (broilers) or a method modified from the official system (breeders). Hock skin lesions and plumage cleanliness were assessed according to the WQ protocol. Broiler gait was scored before slaughter following the WQ protocol. The severity of TD was determined. The use of perches and platforms was monitored by video recording; additionally, farmers estimated platform and perch usage twice a week throughout the growing period. The condition of breeder footpads deteriorated towards the end of the production period, with the occurrence of severe lesions reaching a maximum of 64% on average at slaughter. However, hock burns and breast blisters were rarely recorded. The litter layer became drier over time. Although dry and friable litter in breeder houses was associated with healthier footpads, other factors were of greater importance, as footpad lesions, particularly severe lesions, appeared more often towards slaughter age. Broiler footpads were generally in good condition at slaughter age, with 80% of the birds having healthy footpads. In broilers, hock burns were detected more frequently than footpad lesions. Inferior footpad and hock skin health was scored on wood shavings compared with peat, without differences in litter condition and moisture. 
Moreover, despite the lack of difference in moisture between ground straw and peat, litter, footpad and hock skin condition were poorer on ground straw. Farms differed in footpad and hock burn condition and in litter quality. In the risk analysis, the impact of the farmer on contact dermatitis severity exceeded the effect of litter quality. The platforms were used frequently, while only individual birds used perches. The study indicated no effects of the platform treatment on footpad and hock skin health or litter condition. The birds with access to platforms, however, had enhanced leg health: the mean gait score, the percentage of birds scored 3, and TD percentage and severity were lower for birds in platform-equipped houses. Access to platforms most likely enables more versatile movement, such as walking forward, up and down, grasping with the feet, and jumping, which may promote leg health and gait. This was the first study to follow footpad health in broiler breeders through the whole production period. The results indicate the need for further investigation, because good litter condition alone appears insufficient to keep breeder footpads healthy throughout their lives. Further, this thesis provides new knowledge about the applicability of peat as broiler bedding. According to our results, regarding footpad health, peat seems to be the optimal litter material for Finnish conditions. Furthermore, the study underlines the importance of the farmer's ability to manage litter conditions, regardless of the chosen litter material. Hock burn monitoring could represent a more sensitive indicator of litter condition and possibly also signal leg health status; therefore, monitoring hock burns at slaughter should be considered. The advantages of traditional perches for broilers should be re-evaluated, as they remained largely unused. However, the extensive use of platforms suggests that broilers are motivated to perch on elevated structures. 
Hence, platform availability could enhance their emotional wellbeing. Elevated platforms offering additional possibilities for locomotion seem promising because they show apparent potential to enhance leg health without compromising litter condition or footpad health. Based on all these findings, elevated platforms with ramps can be recommended as a way forward to enhance broiler welfare in commercial environments.
  • Kantala, Tuija (Helsingin yliopisto, 2017)
    Hepatitis E virus (HEV) infections are common in both humans and animals. HEV genotypes 1 and 2 (HEV-1 and HEV-2) only infect humans and are endemic to Asia, Africa, and Central America, where they cause large, usually waterborne hepatitis epidemics. HEV-3 and HEV-4 are zoonotic and infect animals in addition to humans; HEV-3 in particular is common in swine globally. The porcine infection is usually asymptomatic. In humans, HEV-3 and HEV-4 infections are often asymptomatic or cause only mild symptoms, but in immunocompromised patients they can also cause chronic hepatitis that can lead to liver fibrosis, cirrhosis and even death. The aims of this work were to investigate the occurrence of HEV and antibodies against HEV in human patients with unexplained acute hepatitis, in veterinarians, and in production pigs in Finland. Antibodies against HEV were present in 27.6% of human patients diagnosed with acute non-A–C hepatitis, and anti-HEV IgM antibodies, a marker of acute hepatitis E infection, in 11.3% of the patients. All HEV isolates obtained from the patients belonged to HEV-1. Most of the patients with acute infections had recently visited HEV-1-endemic areas in Asia, Africa, or Mexico, indicating that their infections were acquired during travel. However, the possibility of infections acquired in Finland could not be excluded, since no travel data were available for several HEV-positive patients. Of all production pigs of different ages investigated, 15.5% were positive for HEV RNA and 86.3% for antibodies against HEV. Longitudinal follow-up studies on pigs revealed that the pigs were infected with HEV at the age of 2–3 months, when the prevalence of HEV RNA-positive pigs was at its peak, 34.6%. Thereafter, the prevalence of HEV RNA-positive pigs declined to 21.1% at 3–4 months of age and to 2.9% in slaughter-aged pigs. High HEV antibody seroprevalences of over 80% were detected in all age groups tested, from weaner-aged pigs to sows. 
All HEV isolates from pigs were HEV-3, subtype e. Genetically separate clusters of HEV isolates were obtained from different swine farms, suggesting that genetic variation occurs between viruses from different locations. In addition, two different isolates were obtained from the same farm, and HEV-negative swine farms were also seen. The pigs were commonly shedding HEV at the time they were transferred from farrowing farms to fattening farms, creating a possible risk of zoonotic infection for pig handlers. When pigs from HEV-negative and HEV-positive farms arrive at the same fattening farm, infection at a later age during the fattening stage must also be considered possible, which constitutes a risk of HEV entering the food chain in pork at the time of slaughter. With an apparent anti-HEV antibody prevalence of 10.2%, Finnish veterinarians commonly have antibodies against HEV. HEV seropositivity was unexpectedly associated with working as a small animal practitioner and negatively associated with having contacts with swine. However, in contrast to the finding on swine contacts, the seroprevalence appeared to be higher in those who had had a needle-stick injury with a needle previously used on a pig than in those who had not, suggesting that contact with blood or tissue fluid from swine might be a risk factor for HEV infection in veterinarians. In addition, those small animal practitioners who had traveled outside Europe during the previous five years appeared to be seropositive more often than those who had not, suggesting that some infections might have been travel-related. Although pigs seem to play a role in the hepatitis E infections of veterinarians, multiple factors are possibly involved, including reservoirs of HEV other than pigs. Hepatitis E virus must be considered a possible cause of acute hepatitis in humans in Finland, especially in patients who have returned from areas endemic for HEV-1 and HEV-2. 
Although no cases of possibly zoonotic HEV-3 infections acquired in Finland were detected in humans in this study, their possibility should not be overlooked since HEV is widespread in production pigs in Finland and routes for zoonotic infection exist.
  • Pohjola, Leena (Helsingin yliopisto, 2017)
    There is increasing interest in keeping backyard poultry in many countries, including Finland. However, several studies in Western Europe and North America have identified the involvement of backyard poultry flocks in avian influenza virus outbreaks occurring in commercial poultry. In addition, poultry can be carriers of enteric bacterial agents that are human pathogens, commonly without any signs of illness. Farm management and biosecurity practices among 178 backyard poultry flocks were investigated using a questionnaire. Furthermore, the main causes of mortality of backyard chickens were studied through a retrospective study of necropsy data from the Finnish Food Safety Authority Evira from 2000 to 2011. In addition, volunteer backyard poultry farms were visited between October 2012 and January 2013, and blood samples, individual cloacal samples and environmental boot sock samples were collected from 51 farms and 457 chickens. The results of the questionnaire study revealed that backyard poultry farms in Finland were mainly small (91 % ≤ 50 birds) and most flocks (98 %) had access to the outdoors. Biosecurity practices, such as hand washing and changing shoes after bird contact, were rare (35 % and 13 %, respectively). The farms were mainly located far (94 % > 3 km) from commercial poultry farms. The subjectively reported flock health was good (96 %). The most common postmortem diagnoses were Marek's disease (27 %) and colibacillosis (17 %). Of the zoonotic bacterial pathogens, Campylobacter jejuni and Listeria monocytogenes were frequently detected on the farms (45 % and 33 %, respectively). Yersinia enterocolitica was also often isolated on the farms (31 %); however, all isolates were yadA-negative, i.e. non-pathogenic. C. coli, Y. pseudotuberculosis and Salmonella enterica were rarely detected (2 %). All enteric bacteria were highly susceptible to most of the antimicrobials studied, and only a few AmpC-producing and no ESBL-producing E. coli were found. 
Antibodies against avian encephalomyelitis virus, chicken infectious anemia virus and infectious bronchitis virus (IBV) were commonly found in the studied flocks (86 %, 86 % and 47 %, respectively). The IBVs detected in backyard poultry flocks were QX-type strains differing from the strains found on commercial farms, suggesting different routes of infection for commercial and backyard poultry. The results indicated that pathogens circulate among backyard poultry flocks, posing a risk of transmitting infection to commercial poultry in Finland, but because of the distant locations and small flock sizes, the risk is relatively small. Notifiable avian diseases that also have zoonotic potential (avian influenza virus and Newcastle disease virus) are very rare. Backyard chickens are a reservoir of C. jejuni strains and thus a potential source of C. jejuni infection for humans. Because of the lack of good hygiene after bird contact, the risk of transmission of the pathogen from birds to humans exists.
  • Selby, Katja (Helsingin yliopisto, 2017)
    Clostridium botulinum, the causative agent of botulism in humans and animals, is frequently exposed to stressful environments during its growth in food or colonization of a host. The wide genetic diversity of the strains of this foodborne pathogen has been thoroughly studied using different molecular biological methods; however, it is still largely unknown how this diversity is reflected in the ability of different C. botulinum strains to tolerate environmental stresses. In contrast to cold tolerance, which has been the focus of intensive research in recent years, the molecular mechanisms C. botulinum utilizes in response to heat shock and during adaptation to high-temperature stress are poorly understood. The aims of this study were to investigate the strain variation of Group I and II C. botulinum with regard to growth at low, high, and optimal temperatures; the roles of hrcA, the negative regulator of Class I heat shock genes (HSGs), and dnaK, a Class I HSG encoding a molecular chaperone, in the response of the Group I C. botulinum strain ATCC 3502 to heat and other environmental stresses; and the molecular mechanisms this strain employs in response to acute and prolonged heat stress. The maximum and minimum growth temperatures of 23 Group I and 24 Group II C. botulinum strains were studied. Further, the maximum growth rates of the Group I strains at 20, 37, and 42°C and of the Group II strains at 10, 30, 37, and 42°C were determined. Within their groups, the C. botulinum strains showed significant variation in growth-limiting temperatures and in their capability to grow at extreme temperatures, especially at high temperature. The largest strain variation was found within type B strains for Group I and within type E strains for Group II; the latter also showed more mesophilic growth tendencies than the other Group II strains. However, the genetic background of the selected C. botulinum strains was only weakly reflected in their growth characteristics. 
Group I strains showed larger physiological variation than Group II strains despite being genetically more closely related. A number of strains in both groups grew faster at temperatures above their commonly assumed optimal growth temperatures (30°C for Group II and 37°C for Group I) than at those temperatures, and also possessed higher maximum growth temperatures than the average of the studied strains. These strains can be expected to have higher-than-assumed optimal growth temperatures and pronounced high-temperature stress tolerance. Good correlation was detected between maximum growth temperatures and growth rates at high temperature, although not for all strains; therefore, direct prediction from one growth trait to the other was not possible. These findings need to be taken into account when estimating the safety of food products with regard to C. botulinum in risk assessments and challenge studies. The role of Class I HSGs in C. botulinum Group I strain ATCC 3502 was studied by quantitative real-time reverse transcription PCR and by insertional inactivation of the Class I HSGs hrcA and dnaK. During exponential and transitional growth, Class I HSGs were constantly expressed, followed by down-regulation in the stationary phase. Exposure of a mid-exponentially growing culture to heat shock led to strong, transient Class I HSG up-regulation. Inactivation of hrcA resulted in over-expression of all Class I HSGs, which confirmed its role as the negative regulator of Class I HSGs in C. botulinum. Both inactivation mutants showed impaired high-temperature tolerance, as indicated by reduced growth rates at 45°C, a reduced maximum growth temperature, and increased log-reduction after exposure to lethal temperature. The growth of the dnaK mutant was more strongly affected than that of the hrcA mutant, emphasizing the importance of the molecular chaperone DnaK for C. botulinum. 
Reduced growth rates were evident for both mutants under optimal conditions and heat stress, but also under low pH and high salt concentration. This suggests a probable role for Class I HSGs in cross-protection of C. botulinum against other environmental stresses. C. botulinum ATCC 3502 was grown in continuous culture and exposed to heat shock followed by prolonged high-temperature stress at 45°C. Changes in the global gene expression pattern induced by heat stress were investigated using DNA microarray hybridization. Class I and III HSGs, as well as members of the SOS regulon, were employed in response to acute heat stress. High temperature led to suppression of the botulinum neurotoxin-encoding botA and the associated non-toxic protein-coding genes. During adaptation and in the heat-adapted culture, motility- and chemotaxis-related genes were up-regulated, whereas sporulation-related genes were suppressed. Thus, increased motility appeared to be the long-term high-temperature stress-response mechanism preferred over sporulation. Prophage genes, including regulatory genes, were activated by high temperature and might therefore contribute to the high-temperature tolerance of C. botulinum strain ATCC 3502. Further, remodeling of parts of the protein metabolism and changes in carbohydrate metabolism were observed.
  • Viitanen, Sanna (Helsingin yliopisto, 2017)
    Bacterial pneumonia (BP) is an acquired inflammation of the lower airways and lung parenchyma secondary to bacterial infection. BP is difficult to induce experimentally in healthy dogs; its pathogenesis is therefore considered complex, involving several underlying mechanisms. BP was first described in dogs decades ago, but it is still one of the most common systemic bacterial infections in dogs, with significant morbidity and mortality. Several aspects of BP, including the applicability of inflammatory biomarkers in its diagnosis and follow-up as well as the role of respiratory viruses in its clinical picture and development, warrant further study. This thesis aimed to describe clinical findings during the disease and recovery periods in dogs with BP and to evaluate the applicability of acute-phase proteins as diagnostic and follow-up markers in BP. The prevalence and role of viral co-infections in dogs with BP were also investigated. We evaluated the diagnostic applicability of serum C-reactive protein (CRP) and noted that CRP is significantly elevated in BP relative to other lower respiratory tract diseases, such as chronic bronchitis, bacterial tracheobronchitis, canine idiopathic pulmonary fibrosis, and eosinophilic bronchopneumopathy, as well as cardiogenic pulmonary edema. Our results indicate that serum CRP concentration may be used as an additional biomarker in the diagnosis of canine BP. Serum CRP, serum amyloid A (SAA), and haptoglobin (Hp) were followed during the disease and recovery periods. The follow-up study showed that serum CRP and SAA reflected the recovery process well, declining rapidly after initiation of successful therapy, and could therefore be used as markers of treatment response in dogs with BP. Currently, markedly longer antibiotic courses are recommended for dogs with BP than for humans with pneumonia. 
Since serum CRP is a sensitive inflammatory biomarker, it was hypothesized that normalization of serum CRP could be used as an indicator for the cessation of antimicrobial therapy. In our study, we treated one group of dogs according to conventional recommendations; in another group, antimicrobial therapy was ended 5–7 days after CRP normalization. When the normalization of CRP was used to guide antimicrobial therapy, treatment length was significantly reduced without increasing the number of relapses. According to these results, normalization of serum CRP may be applied to guide the length of antimicrobial therapy in dogs with BP. Respiratory viruses, primarily canine parainfluenza virus, were frequently found in lower respiratory tract samples from dogs with BP, indicating that viruses may play an important role in the etiology and pathogenesis of BP. Viral co-infections did not affect disease severity or clinical variables. Our findings add new knowledge about the natural course of BP as well as the possible applications of acute-phase protein measurements in its diagnosis and follow-up. The utilization of acute-phase protein measurements may allow a more precise diagnosis of BP, enable the early identification of patients with a poor response to treatment, and diminish the use of antimicrobial drugs.
  • Cheng, Jing (Helsingin yliopisto, 2016)
    A host and his/her microbiota have co-evolved over time and exert profound effects on each other. The intestinal microbiota has been linked with a number of diseases, such as irritable bowel syndrome; it is considered a major etiopathological factor since it can alter intestinal homeostasis. However, the role of the intestinal microbiota, especially commensals, is unclear in celiac disease. To date, most efforts to detect potential microbial changes associated with celiac disease have focused on adult individuals and have examined fecal material, although early life is known to be the critical period for the microbiota to colonize and establish their niche in the human intestine. At this time in healthy individuals, there is continuing cross-talk with the host, e.g. via the immune system, leading to the establishment of homeostasis in both metabolic and immunological programming. Since the intestinal epithelium is the main interface for host-microbe interactions, the role of the mucosa-associated microbiota may be distinct from that of the fecal microbiota, but neither the normal fluctuations in the intestinal microbiota nor the composition of the duodenal mucosa-associated microbiota is fully clarified. The aims of this thesis were to characterize the development and stability of the intestinal microbiota in healthy young children and to compare microbial features between children and adults. Furthermore, the aim was to investigate host-microbe interactions in celiac disease by studying duodenal mucosa-associated microbial signatures and mucosal gene expression in healthy children and their counterparts with celiac disease. The microbiota profiles were characterized using the human intestinal tract chip (HITChip), a bacterial phylogenetic microarray. The amounts of Bifidobacterium spp. in children and adults were verified with real-time qPCR. 
Mucosal gene expression levels were quantified with reverse transcriptase quantitative PCR. The results showed that the intestinal microbiota is not yet fully matured in children at the age of five. A common core microbiota, including several butyrate-producing bacteria, was identified in children, and it was developing towards the core microbiota found in adults. The different progression patterns of major bacterial taxa may reflect physiological development steps in children. Moreover, differences were observed between healthy- and celiac disease-associated microbial signatures. These differences may reflect changes in epithelial integrity associated with the disease. On the other hand, the studies on both the microbiota and mucosal gene expression indicated that the persistently enhanced Th1-type immune responsiveness in subjects with celiac disease after treatment with a gluten-free diet might result from the increased expression of TLR9, which recognizes unmethylated CpG motifs in bacterial DNA via direct stimulation of immune cells and/or intestinal epithelial cells. The results of this thesis project suggest that specific symbiotic and dysbiotic microbial signatures may provide potential functional diagnostic or therapeutic targets for promoting healthy/natural microbiota development. Long-term studies in a controlled environment with an adequate number of participants will be necessary to decode the disturbed microbial signatures. These trials should be combined with systematic pathological surveillance to reveal how changes in the microbiota influence the onset of disease.
  • Viljamaa-Dirks, Satu (Helsingin yliopisto, 2016)
    Crayfish plague is a severe disease of European crayfish species and has rendered the indigenous crayfish populations vulnerable, endangered or even extinct in most of Europe. Crayfish plague is caused by the oomycete Aphanomyces astaci, a fungus-like water mould that lives its vegetative life in the cuticle of crayfish and infects other crayfish by producing zoospores. Zoospores swim around for a few days in search of crayfish; when they find one, they attach onto its surface, encyst and germinate, and the new hyphae penetrate the crayfish tissues to start a new growth cycle. Unrestricted growth of A. astaci leads to the death of the infected animal in just a few weeks. Crayfish plague-induced mortalities started in Italy around 1860. Although the disease was known from then on, its cause remained unknown for several decades, and little was done to prevent its spread. A lively crayfish trade probably facilitated the spread of the crayfish plague, which reached Finland in 1893. Crayfish plague has remained the most important disease problem of the Finnish noble crayfish, Astacus astacus, ever since. The consensus was that the disease killed all infected animals in a short time, and it appeared almost impossible to restore the flourishing crayfish populations to the levels that existed before. Following the example of neighbouring Sweden, a North American crayfish species, the signal crayfish Pacifastacus leniusculus, which appeared resistant to crayfish plague, was introduced to Finland in the 1960s. As expected, the signal crayfish slowly started to replace the lost populations of the noble crayfish and to become an important part of the crayfish fisheries. However, the introduction of the signal crayfish significantly added to the management problems of the remaining noble crayfish stocks. The signal crayfish proved to be a chronic carrier of the crayfish plague agent and spread the disease to the dwindling, vulnerable noble crayfish populations. 
Later research showed that the crayfish plague agent is a parasite of North American crayfish that in normal circumstances does not harm its host. Intriguingly, the crayfish plague agent carried by the signal crayfish, genotype Ps1, is different from the pathogen originally introduced into Europe, genotype As. The diagnosis of crayfish plague, especially when based on isolation of the pathogen, is challenging, and accordingly the genotype difference went mostly unrecognized until recently. In this study we determined the genotype of the causative agent in most of the detected Finnish crayfish plague cases between 1996 and 2006. It appeared that most of the epidemics in the immediate vicinity of signal crayfish populations were caused by genotype Ps1, whereas genotype As was more prevalent in the noble crayfish areas. Interestingly, a difference was seen in the outcome of the infection: Ps1 infection was always associated with acute mortalities, while As infections were also frequently found in existing but weak populations. The persistent nature of an As infection could be verified in noble crayfish from a small lake in southern Finland. This finding explained why many of the efforts to introduce a new noble crayfish population into a water body after a crayfish plague-induced mortality were futile. The main conclusion from the field study data of this research was the difference in virulence between the Ps1 and As genotype strains. This was also verified in a challenge trial with noble crayfish. While the Ps1 strains did not show much variation in their growth behaviour or virulence, there was much more variation among the As strains. The As genotype arrived in Finland more than 100 years ago, and since then it seems to have adapted to its novel host, the noble crayfish, to some extent. 
To gain insight into a possible vector of this genotype, we studied another North American crayfish species present in Europe, the spiny-cheek crayfish Orconectes limosus, from a Czech pond. This crayfish species appeared to carry a novel genotype of A. astaci, named the Orconectes genotype and designated Or. It seems possible that many of the North American crayfish species carry their own type of crayfish plague agent, with variable features such as virulence. These differences should be further examined in the future. The results of this study underline the necessity of investigating noble crayfish mortalities to verify crayfish plague, including determination of the genotype of the A. astaci strain. Crayfish fisheries and conservation management decisions should not be made without first screening the donor population and the receiving water body for the possible presence of a low-virulence A. astaci strain.
  • Kaartinen, Johanna (Helsingin yliopisto, 2016)
    The alpha-2 agonist medetomidine (MED) and its pure active enantiomer dexmedetomidine (DMED) are in clinical use in small animal practice as potent sedatives, analgesic agents, muscle relaxants, and adjunct agents for balanced anaesthesia. However, their cardiovascular effects limit their use. The use of a constant rate infusion (CRI) for administration of MED was studied in order to provide sedation and analgesia while decreasing the cardiovascular adverse effects. The second avenue of investigation was the addition of the peripheral alpha-2 antagonist MK-467 to limit the haemodynamic effects of MED. The cardiovascular effects of MED CRI were investigated in a dose-finding study: six dose levels were administered during general anaesthesia, from a very low dose up to a high positive-control dose, in order to quantify dose-dependency. To elucidate an appropriate agonist-antagonist dose ratio, a step-down infusion protocol of MED combined with a step-up infusion protocol of MK-467 was performed in anaesthetised dogs. The effects and interaction of both drugs during the absorption and distribution phases were studied in an intramuscular study protocol using co-administration of both drugs, in which three doses of MK-467 were investigated. Plasma concentration measurements provided pharmacokinetic parameter estimates. MED CRI administration showed promising results, demonstrating the dose-dependency of the cardiovascular effects. With low doses of MED CRI, the adverse effects may be minimised, although not completely avoided. A complex pharmacokinetic and pharmacodynamic interaction between the two molecules was revealed: after intramuscular (IM) co-administration of both drugs, the absorption of MED was accelerated by the addition of MK-467 to the treatment. The step infusions revealed that MK-467 also increased the elimination of MED. 
Finding the optimal dose ratio is complicated, as the context in which the drugs are given (IM, IV, CRI, under general anaesthesia, etc.) affects the disposition of MED. The combined pharmacokinetic and pharmacodynamic results provide good initial estimates for future PK-PD modelling to predict the interaction of these two drugs, MED and MK-467, in dogs.
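For readers unfamiliar with CRI pharmacokinetics, the plasma concentration during a constant rate infusion is often approximated with a one-compartment model, in which clearance determines the steady-state level and the elimination rate constant determines how fast it is approached. The sketch below is purely illustrative: the one-compartment assumption and all parameter values are hypothetical and are not taken from the thesis, which used more detailed PK-PD modelling.

```python
import math

def cri_concentration(r0, cl, v, t):
    """Plasma concentration (mg/L) at time t (h) during a constant rate
    infusion, under a simple one-compartment model (illustrative only).

    r0 -- infusion rate (mg/h)
    cl -- clearance (L/h)
    v  -- volume of distribution (L)
    """
    k = cl / v  # first-order elimination rate constant (1/h)
    return (r0 / cl) * (1.0 - math.exp(-k * t))

# Hypothetical values: 60 mg/h infusion, CL = 2 L/h, V = 10 L.
# Steady-state concentration depends only on rate and clearance: Css = r0/CL.
css = cri_concentration(60.0, 2.0, 10.0, 1000.0)   # approaches 30 mg/L
half = cri_concentration(60.0, 2.0, 10.0, math.log(2) * 10.0 / 2.0)  # ~half of Css
```

This also illustrates why context matters for dose ratios: if a co-administered antagonist increases clearance, as reported here for MK-467 and MED, both the steady-state concentration and the time course shift, so a ratio optimal for one route or regimen need not transfer to another.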
  • Tähkäpää, Satu (Helsingin yliopisto, 2016)
    After a series of food incidents in the 1990s, the food business sector has become one of the most heavily regulated sectors in the European Union (EU), with ever-evolving regulations governing both official food control and food business operators (FBOs). The regulatory framework is meaningless if the regulation is not implemented promptly according to transitional provisions and in a unified way. Divergent implementation of food safety legislation may also endanger the equal treatment of FBOs and the principle of free trade of foodstuffs in the EU. This research provides a new perspective on the challenges of implementing food legislation, with the phenomenon surveyed from the viewpoints of both control officials and FBOs. Resources and organization of food control affect actual control work and were thus included. Varying ways of reporting and handling food fraud at the local, national and EU levels were also investigated. Fulfilling the food control requirements set in food legislation necessitates an adequate quantity and quality of personnel, whereas the organization of food control can differ between countries or areas depending on socio-economic and political factors. In Finland, municipalities, alone or as a joint control unit, are responsible for local food control in their respective areas. According to the results, this may lead to varying implementation and interpretation of food legislation, endangering the equal treatment of FBOs. There is an alarming shortage of food control personnel in some regions of Finland. Even though food safety is the responsibility of FBOs, scarce food control resources result in a lower percentage of approved in-house control systems among FBOs. This research revealed a connection between the number of approved in-house control systems and the number of reported food- or waterborne outbreaks in an area, especially in regions with inadequate food control resources. 
EU legislation concerning quality systems, food control plans and food control fees is implemented in Finland at regionally different time points and with different content, directly influencing FBOs in regionally variable ways. Control officials support larger control units, with the rationale that they will increase the equal treatment of FBOs. Both control officials and FBOs have problems in implementing food legislation, and FBOs are also challenged by varying interpretations of legislation and varying requirements of control officials. As food safety is the responsibility of the FBOs, they need to understand and carefully comply with legislation. The challenges of fish and meat FBOs in implementing legislation were therefore evaluated. According to this study, the most common problems concerning food safety legislation are related to the layout of production premises and transport routes, control fees, requirements concerning in-house control, and the structures and maintenance of premises. Risk evaluation is problematic for both control officers and FBOs. Traditional food control measures are challenged when requirements set by law are intentionally violated for financial gain by FBOs, with food deliberately placed on the market with the intention of deceiving the consumer (food fraud). Uniform methods to detect and report food fraud are needed. Hence, patterns of food fraud published in the EU Rapid Alert System for Food and Feed (RASFF) in 2008-2012, recall notifications published by the Finnish Food Safety Authority Evira in 2008-2012, and local Finnish food fraud cases in 2003-2012 were analysed. Patterns of food fraud and manners of reporting fraud at the local, national and EU levels differ significantly. If the detection and reporting of fraud and the legal consequences incurred by FBOs for fraud differ among member states, this may create a distortion of competition.
  • Läikkö-Roto, Tiina (Helsingin yliopisto, 2016)
    The primary legal responsibility for ensuring food safety in the European Union lies with food business operators. However, official controls must also be implemented to ensure that food handling complies with the relevant requirements. The level of food safety is thus affected by several factors: the appropriateness of legislation for achieving food safety, the compliance of food businesses with legislative demands, and the efficacy of official food controls in verifying and enforcing compliance. The main objective of this work was to examine the factors behind efficacious local official food controls and the possibilities for improving the efficacy of the controls at different levels of the food control chain in Finland. The second objective was to investigate the consistency and quality of local official food controls and ways to enhance these. To achieve these aims, studies were conducted at four different levels of the food control chain: the level of food business operators, the level of official inspecting staff, the level of management of the official inspecting staff, and the level of auditing of official food controls. Businesses that both prepared and served foods (in this work, restaurants or restaurant business operators) were chosen as representatives of food businesses. Positive correlations were found between the hygiene knowledge of restaurant business operators, their attitudes towards official food control, and the hygiene level of their operations. Proper justification of the control measures used by food control officials, provision of guidance, and a negotiating approach in official food control tasks appear to be highly important for improving hygiene in food establishments. Several factors related to the food control official and the official's working unit may affect the inspection processes and the efficacy of controls. 
The use of checklists and templates for inspection reports was noted to enhance the consistency and efficacy of controls. The templates also reduced the time used in preparing inspection reports and increased the quality of these reports. Time limits for correcting non-compliances had a significant impact on the efficacy of controls. Food control units had created adequate working conditions by providing their staff with guidance papers, templates, and possibilities to hold collective discussions. However, poor orientation of new staff, non-systematic utilization of tacit knowledge (converting it to explicit knowledge and sharing it), and incomplete commitment among staff to quality systems remain challenges in the units. Insufficient human resources and the inability of heads of food control units to recognize problems in the workplace setting may impair the functional capacity of units. A poor workplace atmosphere and weaknesses in the organization of work may be reflected in lower appreciation of official food controls among food business operators. Perceptions of the auditors (regional officials) and of the auditees (municipal officials) differed greatly regarding the adequacy of the auditing system. The regional officials perceived the auditing visits as clearly more useful and positive than the municipal officials did, and also found the current auditing system more fit for purpose. The regional officials did, however, state that the auditing results had not been adequately utilized in planning the guidance and education of professionals working in official food control. Based on the results of this work, certain weaknesses exist in the efficacy and consistency of local official food controls in Finland. However, several means to improve the efficacy and consistency of the controls were identified at all studied levels of the food control chain. 
Some of the identified measures, such as using checklists during inspections and using templates for inspection reports, are relatively simple to implement. Other measures, such as fully implementing risk-based procedures during inspections and more systematically utilizing the tacit knowledge present among official food control staff, would require a substantial amount of time and effort from the food control authorities.
  • Palander, Pälvi Avilla (Helsingin yliopisto, 2016)
    Tail biting in pigs is a phenomenon that has been known since the 19th century. The prevalence of tail biting damage is estimated at 3-10% of pigs slaughtered, being somewhat higher in non-docked than in tail-docked populations. Being tail bitten causes pain and a state of stress in the victim pigs, as well as general disturbance in pig groups. The aim of this thesis was to improve animal welfare by generating new knowledge for resolving the multifactorial mystery of tail biting behaviour. The precise etiology of tail biting behaviour is unknown, although many risk factors have been identified. The most supported theory of tail biting involves a lack of opportunities for pigs to fulfil their innate need for exploration and foraging in modern production environments. Both of these behaviours are mainly performed orally. Deficiencies in the nutritional state of the animal can increase the motivation for foraging behaviour. The three studies presented in this thesis used a case-control design. Questionnaires were used to investigate feeding- and diet-related and environmental risk factors for tail biting damage at the population level, and animal studies were used to search for differences in intestinal wall structure, nutritional state, blood metabolites, and concentrations of serotonin and dopamine and their metabolites in different brain areas between tail-biter pigs, victim pigs, control pigs in tail biting pens, and control pigs in non-biting pens at the individual level. Results from the population-level study identified environment-, feeding- and management-related risk factors similar to those reported in earlier published epidemiological studies of both long-tailed and tail-docked pigs. Interactions between different types of risk factors were described. The risk factors found here support both etiologies of tail biting, originating from exploratory and from foraging behaviour. 
The similarities between the risk factors for tail biting and those for gastric ulceration need more attention in future studies. Tail-biting pigs were found to have elevated serotonin metabolism in the prefrontal cortex (PFC). The serotonin precursor, tryptophan, and its ratio to other amino acids in blood correlated with serotonin metabolism in the PFC. The experience of stress is suggested to explain the findings in the brain. The influence of enhanced serotonin turnover on amino acid metabolism needs further study. Tail-bitten pigs were found to have decreased concentrations of several non-essential amino acids in blood and a higher ratio of serotonin and dopamine metabolites in the striatum and the limbic cortex. These findings are possibly associated with the experience of stress from being bitten and a change in protein metabolism arising from the acute-phase reaction. In a pen where tail biting was present, not only the biters and bitten pigs were affected, but also the control pigs. Control pigs were found to have reduced intestinal villus height and changes in blood nutrient concentrations that indicate short-term malnutrition, possibly resulting from changes in feed intake.
  • Kilpinen, Susanne (Helsingin yliopisto, 2016)
    The term tylosin-responsive diarrhea (TRD) refers to canine chronic, idiopathic, recurrent diarrhea that responds repeatedly to tylosin treatment. A specific feature of TRD is that diarrhea ceases within a few days of initiating tylosin treatment and the stool remains normal for as long as treatment continues. After discontinuation of tylosin, diarrhea reappears in many dogs within a short time. When tylosin is reintroduced, its effect does not diminish, even after numerous treatments. Despite its wide acceptance as a treatment for canine chronic enteropathies, few published studies have assessed the efficacy of tylosin treatment in chronic diarrhea disorders in dogs. To evaluate the efficacy, a prospective, placebo-controlled, double-blinded, randomized clinical trial was performed. The proportion of dogs with normal fecal consistency at the end of treatment was 85% in the tylosin group and 29% in the placebo group. Tylosin at a dosage of 25 mg/kg once daily for seven days was effective, compared with placebo, in treating diarrhea in TRD dogs. No changes specific to TRD were detected in clinical, laboratory, or histopathologic examinations. In this study, TRD affected mainly middle-aged, large-breed dogs, and the diarrhea was of a mixed small and large intestinal pattern. Diarrhea recurred in 88% of the diagnostically confirmed TRD dogs at a median of 9 days after discontinuation of tylosin. Diarrhea ceased at a median of 2-3 days after reintroduction of tylosin at three different dosages. Tylosin lacks official oral dosage recommendations, and an optimal treatment strategy has not been established. The results showed that 93% of the dogs included in the trial, which suffered from recurrent diarrhea and had previously responded to oral tylosin therapy at a dose of 25 mg/kg once daily for seven days, also responded to doses of 5 and 15 mg/kg once daily for seven days after diarrhea relapse. 
The exact mode of action contributing to the cessation of recurrent diarrhea in dogs after tylosin administration remains obscure. Known pathogenic bacteria have previously been excluded as the underlying cause of TRD. Therefore, the antibacterial effect of tylosin may play a minor role in terminating recurrent diarrhea, and the immunomodulatory properties of tylosin could explain the favorable effect. These could be mediated by a shift of the intestinal microbiota towards potentially beneficial commensal bacteria. The data revealed that, at the time of cessation of diarrhea, tylosin administration had resulted in a significant increase in the fecal levels of Enterococcus spp. in TRD dogs compared with levels during the diarrhea period following discontinuation of tylosin. Cessation of diarrhea in TRD dogs with tylosin treatment could thus be mediated by selection of a specific intestinal lactic acid bacterial population, the Enterococcus spp. To conclude, this thesis provides evidence-based data that tylosin is an effective treatment for chronic, idiopathic, recurrent diarrhea in dogs. The results suggest that an optimal treatment strategy for TRD is to start an empirical treatment trial with tylosin at a dosage of 25 mg/kg once daily for seven days. In the case of a relapse after discontinuation of tylosin, the dose could be tapered down to 5 mg/kg once daily when tylosin is reintroduced. Further, the findings provide new insight into the possible mechanism behind the cessation of chronic diarrhea. The probiotic potential of some enterococcal strains is known, and these enterococci may possess properties that attenuate inflammation in the gut mucosa and normalize fecal consistency. They may thus provide an alternative to the long-term use of antimicrobials in the treatment of TRD in dogs.