Browsing by Subject "OPTIMIZATION"


Now showing items 1-20 of 39
  • Hakkarainen, Janne; Solonen, Antti; Ilin, Alexander; Susiluoto, Jouni; Laine, Marko; Haario, Heikki; Järvinen, Heikki (2013)
  • Herodotou, Herodotos; Chen, Yuxing; Lu, Jiaheng (2020)
    Big data processing systems (e.g., Hadoop, Spark, Storm) contain a vast number of configuration parameters controlling parallelism, I/O behavior, memory settings, and compression. Improper parameter settings can cause significant performance degradation and stability issues. However, regular users and even expert administrators grapple with understanding and tuning them to achieve good performance. We investigate existing approaches to parameter tuning for both batch and stream data processing systems and classify them into six categories: rule-based, cost modeling, simulation-based, experiment-driven, machine learning, and adaptive tuning. We summarize the pros and cons of each approach and raise some open research problems for automatic parameter tuning.
  • Richardson, Dominique; Itkonen, Jaakko; Nievas, Julia; Urtti, Arto; Casteleijn, Marco G. (2018)
    The use of living cells for the synthesis of pharmaceutical proteins, though state-of-the-art, is hindered by its lengthy process comprising many steps that may affect the protein's stability and activity. We aimed to integrate protein expression, purification, and bioconjugation in small volumes coupled with cell-free protein synthesis for the target protein, ciliary neurotrophic factor. Split-intein-mediated capture by use of capture peptides onto a solid surface was efficient at 89–93%. Proof-of-principle of light-triggered release was compared to affinity chromatography (His6 fusion tag coupled with Ni-NTA). The latter was more efficient, but more time consuming. Light-triggered release was clearly demonstrated. Moreover, we transferred biotin from the capture peptide to the target protein without further purification steps. Finally, the target protein was released in a buffer volume and composition of our choice, omitting the need for protein concentration or buffer exchange. Split-intein-mediated capture, protein trans-splicing followed by light-triggered release, and bioconjugation for proteins synthesized in cell-free systems might be performed in an integrated workflow resulting in the fast production of the target protein.
  • Xu, Yongjun; Liu, Xin; Cao, Xin; Huang, Changping; Liu, Enke; Qian, Sen; Liu, Xingchen; Wu, Yanjun; Dong, Fengliang; Qiu, Cheng-Wei; Qiu, Junjun; Hua, Keqin; Su, Wentao; Wu, Jian; Xu, Huiyu; Han, Yong; Fu, Chenguang; Yin, Zhigang; Liu, Miao; Roepman, Ronald; Dietmann, Sabine; Virta, Marko; Kengara, Fredrick; Zhang, Ze; Zhang, Lifu; Zhao, Taolan; Dai, Ji; Yang, Jialiang; Lan, Liang; Luo, Ming; Liu, Zhaofeng; An, Tao; Zhang, Bin; He, Xiao; Cong, Shan; Liu, Xiaohong; Zhang, Wei; Lewis, James P.; Tiedje, James M.; Wang, Qi; An, Zhulin; Wang, Fei; Zhang, Libo; Huang, Tao; Lu, Chuan; Cai, Zhipeng; Wang, Fang; Zhang, Jiabao (2021)
    Artificial intelligence (AI) coupled with promising machine learning (ML) techniques well known from computer science is broadly affecting many aspects of various fields including science and technology, industry, and even our day-to-day life. The ML techniques have been developed to analyze high-throughput data with a view to obtaining useful insights, categorizing, predicting, and making evidence-based decisions in novel ways, which will promote the growth of novel applications and fuel the sustainable booming of AI. This paper undertakes a comprehensive survey on the development and application of AI in different aspects of fundamental sciences, including information science, mathematics, medical science, materials science, geoscience, life science, physics, and chemistry. The challenges that each discipline of science meets, and the potentials of AI techniques to handle these challenges, are discussed in detail. Moreover, we shed light on new research trends entailing the integration of AI into each scientific discipline. The aim of this paper is to provide a broad research guideline on fundamental sciences with potential infusion of AI, to help motivate researchers to deeply understand the state-of-the-art applications of AI-based fundamental sciences, and thereby to help promote the continuous development of these fundamental sciences.
  • Hasan, Galib; Salo, Vili-Taneli; Valiev, Rashid; Kubecka, Jakub; Kurten, Theo (2020)
    Organic peroxy radicals (RO2) are key intermediates in the chemistry of the atmosphere. One of the main sink reactions of RO2 is the recombination reaction RO2 + R'O2, which has three main channels (all with O2 as a coproduct): (1) R(-H)=O + R'OH, (2) RO + R'O, and (3) ROOR'. The RO + R'O "alkoxy" channel promotes radical and oxidant recycling, while the ROOR' "dimer" channel leads to low-volatility products relevant to aerosol processes. The ROOR' channel has only recently been discovered to play a role in the gas phase. Recent computational studies indicate that all of these channels first go through an intermediate complex ¹(RO···³O2···OR'). Here, the triplet ³O2 is very weakly bound and will likely evaporate from the system, giving a triplet cluster of two alkoxy radicals: ³(RO···OR'). In this study, we systematically investigate the three reaction channels for an atmospherically representative set of RO + R'O radicals formed in the corresponding RO2 + R'O2 reaction. First, we systematically sample the possible conformations of the RO···OR' clusters on the triplet potential energy surface. Next, we compute energetic parameters and attempt to estimate reaction rate coefficients for the three channels: evaporation/dissociation to RO + R'O, a hydrogen shift leading to the formation of R'(-H)=O + ROH, and "spin-flip" (intersystem crossing) leading to, or at least allowing, the formation of ROOR' dimers. While large uncertainties in the computed energetics prevent a quantitative comparison of reaction rates, all three channels were found to be very fast (with typical rates greater than 10^6 s^-1). This qualitatively demonstrates that the computationally proposed novel RO2 + R'O2 reaction mechanism is compatible with experimental data showing non-negligible branching ratios for all three channels, at least for sufficiently complex RO2.
  • Xue, Hailian; Mäkelä, Aino Annikki; Valsta, Lauri Tapani; Vanclay, Jerome; Cao, Tianjian (2019)
    Stand management optimization has long been computationally demanding as increasingly detailed growth and yield models have been developed. Process-based growth models are useful tools for predicting forest dynamics. However, the difficulty of applying classic optimization algorithms has limited their use in forest planning. This study assessed alternative approaches to optimizing thinning regimes and rotation length using a process-based growth model. We considered (1) population-based algorithms proposed for stand management optimization, including differential evolution (DE), particle swarm optimization (PSO), and evolution strategy (ES), and (2) derivative-free search algorithms, including the Nelder–Mead method (NM) and Osyczka's direct and random search algorithm (DRS). We incorporated the population-based algorithms into the simulation-optimization system OptiFor, in which the process-based model PipeQual was the simulator. The results showed that DE was the most reliable algorithm among those tested. DRS was also an effective algorithm for sparse stands with fewer decision variables. PSO reached somewhat higher objective function values in some cases; however, its computational time was the longest. In general, DE was superior to the competing population-based algorithms, and its effectiveness for stand management optimization is promising.
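    Of the population-based algorithms above, differential evolution proved the most reliable. A minimal sketch of the classic DE/rand/1/bin scheme follows; the objective function and all parameter values here are hypothetical stand-ins (a real application would evaluate thinning regimes with a growth simulator such as PipeQual).

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical smooth surrogate with optimum at x = 0.5 in every dimension.
    return -np.sum((x - 0.5) ** 2)

def differential_evolution(f, dim, pop_size=20, F=0.8, CR=0.9, generations=200):
    # Decision variables are kept in [0, 1] (e.g., scaled thinning intensities).
    pop = rng.random((pop_size, dim))
    fitness = np.array([f(ind) for ind in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), 0.0, 1.0)   # mutation
            cross = rng.random(dim) < CR                   # binomial crossover
            cross[rng.integers(dim)] = True                # ensure >= 1 gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft >= fitness[i]:                           # greedy selection
                pop[i], fitness[i] = trial, ft
    best = np.argmax(fitness)
    return pop[best], fitness[best]

x_best, f_best = differential_evolution(objective, dim=4)
```

    On a smooth test function like this, the greedy one-to-one selection keeps the best solution per slot, which is one reason DE is robust when each evaluation is an expensive simulator run.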
  • Ruuth, Riikka; Kuusela, Linda; Mäkelä, Teemu; Melkas, Susanna; Korvenoja, Antti (2019)
    Aim and scope: A Gradient Echo Plural Contrast Imaging technique (GEPCI) is a post-processing method, which can be used to obtain quantitative T2* values and generate multiple synthetic contrasts from a single acquisition. However, scan duration and image reconstruction from k-space data present challenges in a clinical workflow. This study aimed at optimizing image reconstruction and acquisition duration to facilitate a post-processing method for synthetic image contrast creation in clinical settings. Materials and methods: This study consists of tests using the American College of Radiology (ACR) image quality phantom, two healthy volunteers, four mild traumatic brain injury patients and four small vessel disease patients. The measurements were carried out on a 3.0 T scanner with multiple echo times. Reconstruction from k-space data and DICOM data with two different coil-channel combination modes were investigated. Partial Fourier techniques were tested to optimize the scanning time. Conclusions: Sum of squares coil-channel combination produced artifacts in phase images, but images created with adaptive combination were artifact-free. The voxel-wise median signed difference of T2* between the vendor's adaptive channel combination and k-space reconstruction modes was 2.9 +/- 0.7 ms for white matter and 4.5 +/- 0.6 ms for gray matter. Relative white matter/gray matter contrast of all synthetic images and contrast-to-noise ratio of synthetic T1-weighted images were almost equal between reconstruction modes. Our results indicate that synthetic contrasts can be generated from the vendor's DICOM data with the adaptive combination mode without affecting the quantitative T2* values or white matter/gray matter contrast.
  • Mäntylä, Teemu; Kieseppä, Tuula; Suvisaari, Jaana; Raij, Tuukka T. (2021)
    Poor insight is a central characteristic of psychotic disorders, and it has been suggested to result from a general dysfunction in self-reflection. However, brain processing of clinical insight and more general self-reflection has not been directly compared. We compared tasks on (1) self-reflection on psychosis-related mental functioning (clinical insight, in patients only), (2) self-reflection on mental functioning unrelated to psychosis (general metacognition), and (3) semantic control during blood-oxygenation-level-dependent (BOLD) functional magnetic resonance imaging with 19 first-episode psychosis patients and 24 control participants. Arterial-spin-labeling (ASL) images were collected at rest. Clinical insight was evaluated with the Schedule for the Assessment of Insight. In patients, posterosuperior precuneus showed stronger activation during the insight task than during the semantic control task, while anteroinferior precuneus and posterior cingulate cortex (PCC) showed stronger activation during the insight task than during the general metacognition task. No significant group differences in brain activation emerged during the general metacognition task. Although the BOLD measures did not correlate with clinical insight measures, ASL-measured cerebral blood flow (CBF) values did correlate when extracted from the task-selective precuneus/PCC areas: higher CBF correlated with higher clinical insight scores. These results suggest that regions in the posteromedial cortex are selective for clinical insight.
  • Peltonen, Leena (2018)
    Drug nanocrystals are nanosized solid drug particles, whose most important application is improving the solubility of poorly soluble drug materials. Drug nanocrystals can be produced by many different techniques, but the most widely used are various media milling techniques; in milling, the particle size of bulk drug material is decreased, with the aid of milling beads, to the nanometer scale. Utilization of the Quality by Design (QbD) approach in nanomilling improves process understanding of the system, and recently the number of studies using the QbD approach in nanomilling has increased. In the QbD approach, quality is built into the products and processes throughout the whole production chain. Definition of Critical Quality Attributes (CQAs) determines the targeted final product properties. CQAs are achieved by setting Critical Process Parameters (CPPs), which include both process parameters and input variables, such as stabilizer amount or the solid-state form of the drug. Finally, the Design Space determines the limits within which CPPs should remain in order to reach the CQAs. This review discusses the milling process and process variables, CPPs, their impact on product properties, CQAs, and challenges of the QbD approach in nanomilling studies.
  • Mielonen, Outi I.; Pratas, Diogo; Hedman, Klaus; Sajantila, Antti; Perdomo, Maria (2022)
    Formalin fixation, albeit an outstanding method for morphological and molecular preservation, induces DNA damage and cross-linking, which can hinder nucleic acid screening. This is of particular concern in the detection of low-abundance targets, such as persistent DNA viruses. In the present study, we evaluated the analytical sensitivity of viral detection in lung, liver, and kidney specimens from four deceased individuals. The samples were either frozen or incubated in formalin (+/- paraffin embedding) for up to 10 days. We tested two DNA extraction protocols to compare yields and viral detection. We used short-amplicon qPCRs (63-159 nucleotides) to detect 11 DNA viruses, as well as hybridization capture of these plus 27 additional ones, followed by deep sequencing. We observed marginally higher ratios of amplifiable DNA and slightly higher viral genoprevalences in the samples extracted with the FFPE-dedicated protocol. Based on the findings in the frozen samples, most viruses were detected regardless of the extended fixation times. False-negative calls, particularly by qPCR, correlated with low levels of viral DNA (150 base pairs). Our data suggest that low-copy viral DNAs can be satisfactorily investigated from FFPE specimens and encourage further examination of historical materials.
  • Ollinaho, Pirkka; Carver, Glenn D.; Lang, Simon T. K.; Tuppi, Lauri; Ekblom, Madeleine; Järvinen, Heikki (2021)
    Ensemble prediction is an indispensable tool in modern numerical weather prediction (NWP). Due to its complex data flow, global medium-range ensemble prediction has almost exclusively been carried out by operational weather agencies to date. Thus, it has been very hard for academia to contribute to this important branch of NWP research using realistic weather models. In order to open ensemble prediction research up to the wider research community, we have recreated all 50 + 1 operational IFS ensemble initial states for OpenIFS CY43R3. The dataset (Open Ensemble 1.0) is available for use under a Creative Commons licence and is downloadable from an https server. The dataset covers 1 year (December 2016 to November 2017) twice daily. Downloads in three model resolutions (T(L)159, T(L)399, and T(L)639) are available to cover different research needs. An open-source workflow manager, called OpenEPS, is presented here and used to launch ensemble forecast experiments from the perturbed initial conditions. The deterministic and probabilistic forecast skill of OpenIFS (cycle 40R1) using this new set of initial states is comprehensively evaluated. In addition, we present a case study of Typhoon Damrey from year 2017 to illustrate the new potential of being able to run ensemble forecasts outside of major global weather forecasting centres.
  • Euclid Collaboration; Knabenhans, M.; Stadel, J.; Gozaliasl, G.; Keihänen, E.; Kirkpatrick, C. C.; Kurki-Suonio, H.; Väliviita, J. (2021)
    We present a new, updated version of the EuclidEmulator (called EuclidEmulator2), a fast and accurate predictor for the nonlinear correction of the matter power spectrum. Emulation accurate at the 2 per cent level is now supported in the eight-dimensional parameter space of w0waCDM+Σmν models between redshift z = 0 and z = 3 for spatial scales within the range . In order to achieve this level of accuracy, we have had to improve the quality of the underlying N-body simulations used as training data: (i) we use self-consistent linear evolution of non-dark-matter species such as massive neutrinos, photons, dark energy, and the metric field; (ii) we perform the simulations in the so-called N-body gauge, which allows one to interpret the results in the framework of general relativity; (iii) we run over 250 high-resolution simulations with 3000³ particles in boxes of (1 h⁻¹ Gpc)³ volume based on paired-and-fixed initial conditions; and (iv) we provide a resolution correction that can be applied to emulated results as a post-processing step in order to drastically reduce systematic biases on small scales due to residual resolution effects in the simulations. We find that the inclusion of the dynamical dark energy parameter wa significantly increases the complexity and expense of creating the emulator. The high fidelity of EuclidEmulator2 is tested in various comparisons against N-body simulations as well as alternative fast predictors such as HALOFIT, HMCode, and CosmicEmu. A blind test is successfully performed against the Euclid Flagship v2.0 simulation. Nonlinear correction factors emulated with EuclidEmulator2 are accurate at the level of or better for and z .
  • Sanz, Dafne Jacome; Raivola, Juuli; Karvonen, Hanna; Arjama, Mariliina; Barker, Harlan; Murumägi, Astrid; Ungureanu, Daniela (2021)
    Simple Summary Ovarian cancer (OC) is known for its poor prognosis, due to the absence of reliable biomarkers and its late diagnosis, since the early-stage disease is almost asymptomatic. Lipid metabolism plays an important role in OC progression due to the development of omental metastasis in the abdominal cavity. The aim of our study was to assess the therapeutic role of various enzymes involved in lipid metabolism regulation or synthesis, in different subtypes of OC represented by cell lines as well as patient-derived cancer cell cultures (PDCs). We show that proprotein convertase subtilisin/kexin type 9 (PCSK9), a cholesterol-regulating enzyme, plays a pro-survival role in OC and targeting its expression impairs cancer cell growth. We also tested a small library of metabolic and mTOR-targeting drugs to identify drug vulnerabilities specific to various subtypes of OC. Our results show that in OC cell lines and PDCs the second generation of mTOR inhibitors such as AZD8055, vistusertib, dactolisib and sapanisertib, have higher cytotoxic activity compared to the first generation mTOR inhibitors such as rapalogs. These results suggest that, in the era of precision medicine, it is possible to target the metabolic pathway in OC and identify subtype-specific drug vulnerabilities that could be advanced to the clinic. Background: Dysregulated lipid metabolism is emerging as a hallmark in several malignancies, including ovarian cancer (OC). Specifically, metastatic OC is highly dependent on lipid-rich omentum. We aimed to investigate the therapeutic value of targeting lipid metabolism in OC. For this purpose, we studied the role of PCSK9, a cholesterol-regulating enzyme, in OC cell survival and its downstream signaling. We also investigated the cytotoxic efficacy of a small library of metabolic (n = 11) and mTOR (n = 10) inhibitors using OC cell lines (n = 8) and ex vivo patient-derived cell cultures (PDCs, n = 5) to identify clinically suitable drug vulnerabilities. 
Targeting PCSK9 expression with siRNA or PCSK9 specific inhibitor (PF-06446846) impaired OC cell survival. In addition, overexpression of PCSK9 induced robust AKT phosphorylation along with increased expression of ERK1/2 and MEK1/2, suggesting a pro-survival role of PCSK9 in OC cells. Moreover, our drug testing revealed marked differences in cytotoxic responses to drugs targeting metabolic pathways of high-grade serous ovarian cancer (HGSOC) and low-grade serous ovarian cancer (LGSOC) PDCs. Our results show that targeting PCSK9 expression could impair OC cell survival, which warrants further investigation to address the dependency of this cancer on lipogenesis and omental metastasis. Moreover, the differences in metabolic gene expression and drug responses of OC PDCs indicate the existence of a metabolic heterogeneity within OC subtypes, which should be further explored for therapeutic improvements.
  • Känkänen, Voitto; Seitsonen, Jani; Tuovinen, Henri Mikael; Ruokolainen, Janne; Hirvonen, Jouni; Balasubramanian, Vimalkumar; Santos, Hélder A. (2020)
    Nanoprecipitation is a straightforward method for the production of block copolymer nanoparticles for drug delivery applications. However, the effects of process parameters need to be understood to optimize and control the particle size distribution (PSD). To this end, we investigated the effects of material and process factors on PSD and morphology of nanoparticles prepared from an amphiphilic diblock copolymer, poly(ethylene oxide)-block-polycaprolactone. Using a Design of Experiments approach, we explored the joint effects of molecular weight, block length ratios, water volume fraction, stirring rate, polymer concentration and organic phase addition rate on hydrodynamic size and polydispersity index of the nanostructures and created statistical models explaining up to 94 % of the variance in hydrodynamic diameter. In addition, we performed morphological characterization by cryogenic transmission electron microscopy and showed that increasing the process temperature may favor the formation of vesicles from these polymers. We showed that the effects of process parameters are dependent on the polymer configuration and we found that the most useful parameters to fine-tune the PSD are the initial polymer concentration and the stirring rate. Overall, this study provides evidence on the joint effects of material and process parameters on PSD and morphology, which will be useful for rational design of formulation-specific optimization studies, scale-up and process controls.
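    A Design of Experiments screening like the one described can be sketched with a two-level full factorial design, estimating each factor's main effect on particle size. The factor names and the linear response function below are hypothetical stand-ins, not the study's actual model.

```python
import itertools
import numpy as np

# Two-level (coded -1/+1) full factorial design over three process factors.
factors = ["polymer_conc", "stirring_rate", "addition_rate"]
levels = [-1, 1]
design = np.array(list(itertools.product(levels, repeat=len(factors))))  # 8 runs

def response(run):
    # Hypothetical response (nm): size grows with concentration,
    # shrinks with stirring rate, with a small addition-rate effect.
    conc, stir, add = run
    return 100 + 15 * conc - 10 * stir + 3 * add

y = np.array([response(run) for run in design])

# Main effect of each factor: mean response at +1 minus mean response at -1.
effects = {f: y[design[:, i] == 1].mean() - y[design[:, i] == -1].mean()
           for i, f in enumerate(factors)}
```

    Because the design is balanced, each main effect is estimated independently of the others, which is what makes such screenings efficient for identifying the parameters most useful for fine-tuning the size distribution.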
  • Rosa-Sibakov, Natalia; Sorsamäki, Lotta; Immonen, Mikko; Nihtilä, Hanna; Maina, Ndegwa; Siika-aho, Matti; Katina, Kati; Nordlund, Emilia (2022)
    Food-grade enzymes (alpha-amylase, amyloglucosidase, maltogenic amylase, and protease) were investigated for recycling waste bread back into the wheat bread-making process. Waste bread was efficiently hydrolyzed into sugars (up to 93% glucose yield), and the best combination of enzymes was alpha-amylase (0.05 g/kg bread) and amyloglucosidase (2.5 g/kg bread). Selected enzyme hydrolysis processes were tested in wheat bread making as (a) a hydrolyzed slurry, that is, hydrolyzed waste bread without solid/liquid separation, and (b) a syrup, that is, the liquid supernatant after centrifugation of the hydrolyzed waste bread. Both hydrolyzed bread slurry and syrup were successfully utilized to replace sucrose (2 and 4%) in bread making without affecting bread quality compared to the control bread. Techno-economic assessment revealed that this approach is 12% more economical than the current means of disposing of bakery waste. This recycling concept showed both technical and economic potential for bakery industries to overcome their excess bread production. Novelty impact statement: A new recycling process was developed by using enzymes to efficiently hydrolyze surplus bread into sugars. The use of these sugar-rich slurries and syrups in bread rework did not affect bread quality compared to the control bread. The recycling concept was more economical than current means of disposing of waste bread, revealing its technical and economic potential for bakery industries.
  • Zanca, Tommaso; Kubecka, Jakub; Zapadinsky, Evgeni; Passananti, Monica; Kurten, Theo; Vehkamäki, Hanna (2020)
    Identification of atmospheric molecular clusters and measurement of their concentrations by atmospheric pressure interface time-of-flight (APi-TOF) mass spectrometers may be affected by systematic error due to possible decomposition of clusters inside the instrument. Here, we perform numerical simulations of the decomposition inside an APi-TOF mass spectrometer, and the formation in the atmosphere, of a set of clusters involving a representative highly oxygenated organic molecule (HOM) with the molecular formula C10H16O8. This elemental composition corresponds to one of the most common mass peaks observed in experiments on ozone-initiated autoxidation of alpha-pinene. Our results show that decomposition is highly unlikely for the considered clusters, provided their bonding energy is large enough to allow formation in the atmosphere in the first place.
  • Lan, Hangzhen; Salmi, Leo D.; Rönkkö, Tuukka; Parshintsev, Jevgeni; Jussila, Matti; Hartonen, Kari; Kemell, Marianna; Riekkola, Marja-Liisa (2018)
    New chemical vapor reaction (CVR) and atomic layer deposition (ALD) conversion methods were utilized for the first time to prepare metal organic framework (MOF) coatings for solid phase microextraction (SPME) Arrow. With a simple and convenient one-step reaction or conversion, four MOF coatings were made by suspending an ALD iron oxide (Fe2O3) or aluminum oxide (Al2O3) film above terephthalic acid (H2BDC) or trimesic acid (H3BTC) vapor. A UIO-66 coating was made by treating a zirconium (Zr)-BDC film in acetic acid vapor. As the first documented instance of all-gas-phase synthesis of SPME Arrow coatings, preparation parameters including CVR/conversion time and temperature, acetic acid volume, and metal oxide/metal-ligand film thickness were investigated. The optimal coatings exhibited crystalline structures, excellent uniformity, satisfactory thickness (2-7.5 μm), and high robustness (>80 uses). To study the practical usefulness of the coatings for extraction, several analytes with different chemical properties were tested. The Fe-BDC coating was found to be the most selective and sensitive for the determination of benzene-ring-containing compounds due to its highly hydrophobic surface and unsaturated metal sites. The UIO-66 coating was best for small polar, aromatic, and long-chain polar compounds owing to its high porosity. The usefulness of the new coatings was evaluated by gas chromatography-mass spectrometry (GC-MS) determination of several analytes, present in wastewater samples at three concentration levels, and satisfactory results were achieved.
  • Pajula, Juha; Kauppi, Jukka-Pekka; Tohka, Jussi (2012)
  • Toivanen, Jussi; Meaney, Alexander; Siltanen, Samuli; Kolehmainen, Ville (2020)
    Multi-energy CT takes advantage of the non-linearly varying attenuation properties of elemental media with respect to energy, enabling more precise material identification than single-energy CT. The increased precision comes with the cost of a higher radiation dose. A straightforward way to lower the dose is to reduce the number of projections per energy, but this makes tomographic reconstruction more ill-posed. In this paper, we propose how this problem can be overcome with a combination of a regularization method that promotes structural similarity between images at different energies and a suitably selected low-dose data acquisition protocol using non-overlapping projections. The performance of various joint regularization models is assessed with both simulated and experimental data, using the novel low-dose data acquisition protocol. Three of the models are well-established, namely the joint total variation, the linear parallel level sets and the spectral smoothness promoting regularization models. Furthermore, one new joint regularization model is introduced for multi-energy CT: a regularization based on the structure function from the structural similarity index. The findings show that joint regularization outperforms individual channel-by-channel reconstruction. Furthermore, the proposed combination of joint reconstruction and non-overlapping projection geometry enables significant reduction of radiation dose.
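    The joint total variation model named above couples the finite-difference gradients of all energy channels under a single square root, so edges that coincide across channels are penalized less than the same gradient magnitudes at misaligned locations. A minimal sketch, with an illustrative channel count and smoothing constant eps:

```python
import numpy as np

def joint_tv(images, eps=1e-8):
    # images: array of shape (channels, H, W).
    dx = np.diff(images, axis=2)[:, :-1, :]   # horizontal differences
    dy = np.diff(images, axis=1)[:, :, :-1]   # vertical differences
    # Sum squared gradients over all channels, then one square root per pixel:
    # this is what couples the channels and rewards shared edge locations.
    return np.sum(np.sqrt(np.sum(dx**2 + dy**2, axis=0) + eps))

# Two channels sharing the same edge yield a smaller penalty than the same
# edges placed at different locations.
a = np.zeros((2, 8, 8)); a[:, :, 4:] = 1.0                      # aligned edges
b = np.zeros((2, 8, 8)); b[0, :, 4:] = 1.0; b[1, 4:, :] = 1.0   # misaligned
```

    In a reconstruction, this penalty is added to the data-fidelity terms of all channels and minimized jointly, which is how structural similarity between energies compensates for the reduced number of projections per energy.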
  • Martino, L.; Elvira, V.; Luengo, D.; Corander, J. (2017)
    Monte Carlo methods represent the de facto standard for approximating complicated integrals involving multidimensional target distributions. In order to generate random realizations from the target distribution, Monte Carlo techniques use simpler proposal probability densities to draw candidate samples. The performance of any such method is strictly related to the specification of the proposal distribution, such that unfortunate choices easily wreak havoc on the resulting estimators. In this work, we introduce a layered (i.e., hierarchical) procedure to generate samples employed within a Monte Carlo scheme. This approach ensures that an appropriate equivalent proposal density is always obtained automatically (thus eliminating the risk of a catastrophic performance), although at the expense of a moderate increase in the complexity. Furthermore, we provide a general unified importance sampling (IS) framework, where multiple proposal densities are employed and several IS schemes are introduced by applying the so-called deterministic mixture approach. Finally, given these schemes, we also propose a novel class of adaptive importance samplers using a population of proposals, where the adaptation is driven by independent parallel or interacting Markov chain Monte Carlo (MCMC) chains. The resulting algorithms efficiently combine the benefits of both IS and MCMC methods.
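    The deterministic mixture approach mentioned above weights each sample by the full mixture of proposal densities rather than only by the proposal that generated it, which bounds the importance weights and tames estimator variance. A toy sketch with a bimodal target and three Gaussian proposals (all numerical choices here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def target_pdf(x):
    # Unnormalized bimodal target: equal-weight Gaussians at -2 and +2.
    return 0.5 * np.exp(-0.5 * (x + 2) ** 2) + 0.5 * np.exp(-0.5 * (x - 2) ** 2)

mus = np.array([-3.0, 0.0, 3.0])   # proposal means
sigma = 1.5                        # common proposal standard deviation
n_per = 2000                       # samples drawn per proposal

def normal_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

samples, weights = [], []
for mu in mus:
    x = rng.normal(mu, sigma, n_per)
    # Deterministic mixture: the denominator is the average of ALL proposal
    # densities, not just the density of the proposal that produced x.
    mix = np.mean([normal_pdf(x, m, sigma) for m in mus], axis=0)
    samples.append(x)
    weights.append(target_pdf(x) / mix)

x = np.concatenate(samples)
w = np.concatenate(weights)
mean_est = np.sum(w * x) / np.sum(w)   # self-normalized IS estimate of E[x]
```

    Because the target is symmetric about zero, the estimate should be close to 0; with standard (non-mixture) weights, a sample from one proposal falling under another proposal's mode could receive a huge weight, which the mixture denominator prevents.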