Tag: Materials Science

Fine-tuning the theoretically predicted structure of MIL-47(V) with the aid of powder X-ray diffraction

Authors: Thomas Bogaerts, Louis Vanduyfhuys, Danny E. P. Vanpoucke, Jelle Wieme,
Michel Waroquier, Pascal van der Voort and Veronique van Speybroeck
Journal: Cryst. Eng. Comm. 17(45), 8612-8622 (2015)
doi: 10.1039/c5ce01388g
IF(2015): 3.849
export: bibtex
pdf: <Cryst.Eng.Comm.> 
Graphical Abstract: Which model represents the experimental XRD pattern best? Ferromagnetic or anti-ferromagnetic chains? With or without offset?

Abstract

The structural characterization of complex crystalline materials such as metal organic frameworks can prove a very difficult challenge, both for experimentalists and for theoreticians. From theory, the flat potential energy surface of these highly flexible structures often leads to different geometries that are energetically very close to each other. In this work a distinction between various computationally determined structures is made by comparing experimental and theoretically derived X-ray diffractograms, which are produced from the material's geometry. The presented approach allows one to choose the most appropriate geometry of a MIL-47(V) MOF and even to distinguish between different electronic configurations that induce small structural changes. Moreover, the techniques presented here are used to verify the applicability of a newly developed force field for this material. The discussed methodology is of significant importance for modelling studies where accurate geometries are crucial, such as mechanical properties and adsorption of guest molecules.
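
As a quick illustration of the core idea (comparing patterns quantitatively rather than by eye), here is a minimal, hypothetical Python sketch of an R-factor-like agreement measure between two diffractograms on a common 2θ grid. The peak positions and arrays are invented placeholders, not the MIL-47(V) data, and the paper itself uses a more careful comparison.

```python
import numpy as np

def r_factor(i_exp, i_calc):
    """Simple profile R-factor between two diffractograms sampled on the
    same 2-theta grid; lower values mean better agreement."""
    i_exp = np.asarray(i_exp, dtype=float)
    i_calc = np.asarray(i_calc, dtype=float)
    # normalise both patterns so only peak positions/shapes matter
    i_exp = i_exp / i_exp.max()
    i_calc = i_calc / i_calc.max()
    return np.sqrt(np.sum((i_exp - i_calc) ** 2) / np.sum(i_exp ** 2))

# hypothetical patterns: a single Gaussian "reflection", slightly shifted
two_theta = np.linspace(5.0, 50.0, 2000)
exp_pattern = np.exp(-((two_theta - 9.2) / 0.15) ** 2)
model_a = np.exp(-((two_theta - 9.2) / 0.15) ** 2)   # candidate geometry A
model_b = np.exp(-((two_theta - 9.6) / 0.15) ** 2)   # candidate geometry B

print("R(model A) =", r_factor(exp_pattern, model_a))  # ~0     -> best match
print("R(model B) =", r_factor(exp_pattern, model_b))  # larger -> worse match
```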

Phonons: shake those atoms

In physics, a phonon is a collective excitation in a periodic, elastic arrangement of atoms or molecules in condensed matter, like solids and some liquids. Often designated a quasi-particle, it represents an excited state in the quantum mechanical quantization of the modes of vibrations of elastic structures of interacting particles.

— source: wikipedia

Or, for simplicity: sound waves; the ordered shaking of atoms or molecules. When you hit a metal bell with a (small) hammer, or make a wineglass sing by rubbing its edge, you experience the vibrations of the object as sound. All objects in nature (from atoms to stars) can be made to vibrate, and they do this at one or more specific frequencies: their eigenfrequencies or normal frequencies.

Single molecules, too, start to vibrate if they are hit (for example, by another molecule bumping into them) or receive extra energy in some other way. These vibrations can take many forms (elongating and shortening of bonds, rotation of parts of the molecule with respect to other parts, flip-flopping of loose ends, and so forth) and give a unique signature to the molecule, since each of these vibrations (so-called eigenmodes) corresponds to a specific energy given to the molecule. As a result, if you know all the eigenmodes of a molecule, you also know which frequencies of infrared light it should absorb, which is very useful, since in experiment we do not “see” molecules (if we see them at all) as nice ball-and-stick objects.

From the computational point of view, this is not the only reason why the vibrational frequencies of a system (i.e. the above eigenmodes) are calculated in molecular modeling. They also tell you whether a system is in its ground state (which is what one is looking for most of the time) or not. Although this tool is in widespread use in molecular modeling, it is seldom used in ab initio solid state physics because of the associated computational cost. In addition, because of the finite size of the unit cell, the reciprocal space in which phonons live also has a finite size, in contrast to the single point for a molecule…making life complex. 😎
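
To make this a bit more tangible, here is a toy Python sketch (my own illustration, not tied to any code mentioned on this blog) evaluating the textbook phonon dispersion of a one-dimensional chain of identical atoms coupled by springs; all parameter values are arbitrary.

```python
import numpy as np

# Phonon dispersion of the 1D monoatomic chain:
#   omega(k) = 2*sqrt(K/m)*|sin(k*a/2)|
# with nearest-neighbour spring constant K, atomic mass m and lattice
# parameter a (all set to 1 in arbitrary units for this illustration).
K, m, a = 1.0, 1.0, 1.0

k = np.linspace(-np.pi / a, np.pi / a, 11)   # sample the first Brillouin zone
omega = 2.0 * np.sqrt(K / m) * np.abs(np.sin(k * a / 2.0))

for ki, wi in zip(k, omega):
    print(f"k = {ki:+.3f}   omega = {wi:.3f}")
```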

Spring School Computational Tools: Day 5 – CP2K

Today was the fifth and last day of our spring school on computational tools for materials science. However, this was no reason to sit back and relax. After the introductions to VASP (day 2) and ABINIT (day 3) for solids, and to Gaussian (day 4) for molecules, today's code (CP2K) is one that allows you to study both, with a focus on dynamics and solvation.

If ensembles were coffee…

The introduction to the Swiss army knife called CP2K was provided by Dr. Andy Van Yperen-De Deyne. He explained to us the nature of the CP2K code (periodic, tools for solvated molecules, and a focus on large/huge systems) and its limitations. In contrast to the codes of the previous days, CP2K uses a dual basis set: plane waves where the properties are described most easily and accurately by plane waves, and Gaussians where Gaussians do the better job. By means of some typical types of calculations, Andy explained the basic setup of the input and output files, and warned about the explosive nature of too-long time steps in molecular dynamics simulations. The possible ensembles for molecular dynamics (MD) were explained as different ways to store hot coffee. Following our daily routine, this session was followed by a hands-on session.
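
To show what that explosive nature looks like, here is a small stand-alone Python toy (not CP2K input, just my own illustration): a harmonic oscillator integrated with velocity Verlet stays well behaved for small time steps, but once the step exceeds the stability limit (dt = 2 in these units) the total energy blows up.

```python
import numpy as np

def velocity_verlet(x, v, dt, n_steps, k=1.0, m=1.0):
    """Integrate a 1D harmonic oscillator (force = -k*x) with velocity
    Verlet and return the total energy after the final step."""
    a = -k * x / m
    for _ in range(n_steps):
        x += v * dt + 0.5 * a * dt**2
        a_new = -k * x / m
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return 0.5 * m * v**2 + 0.5 * k * x**2

# the exact total energy of this oscillator is 0.5 (arbitrary units)
for dt in (0.01, 0.5, 2.05):   # the last step size exceeds the stability limit
    e = velocity_verlet(x=1.0, v=0.0, dt=dt, n_steps=200)
    print(f"dt = {dt:>5}: total energy after 200 steps = {e:.3e}")
```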

In the afternoon, the advanced session was presented by a triumvirate: Thierry De Meyer, who discussed QM/MM simulations in detail, Dr. Andy Van Yperen-De Deyne, who discussed vibrational fingerprinting, and Lennart Joos, who, as the last presenter of the week, showed how different codes can be combined within a single project, each used where it is strongest, allowing him to unchain his zeolites.

CP2K, all lecturers: Andy Van Yperen-De Deyne (top left), Thierry De Meyer (top right), Lennart Joos (bottom left). All spring school participants hard at work during the hands-on session, even on this last day (bottom right).

The spring school ended with a final hands-on session on CP2K, where the CMM team was present for the last stretch, answering questions, giving final pointers on how to perform simulations, and discussing which code would be most appropriate for each project. At 17h, after my closing remarks, the curtain fell on this spring school on computational tools for materials science. It has been a busy week, and Kurt and I are grateful for the help we got from everyone involved in this spring school, both local and external guests. Tired but happy I look back…and also a little bit forward, hoping and already partially planning a next edition…maybe in two years we will return.

Our external lecturers from the VASP group (Martijn Marsman, top left) and the ABINIT group (Xavier Gonze, top right, Matteo Giantomassi, bottom left, Gian-Marco Rignanese, bottom right)

Spring School Computational Tools: Day 4 – Gaussian

After having focused on solids during the previous two days of our spring school, using either VASP (Tuesday) or ABINIT (Wednesday), today the focus shifts to molecules and we turn our attention to the Gaussian code.

Dietmar Hertsen introduced us to the Gaussian code, and immediately showed us why this code was included in our spring school set: it is the most popular code (according to Google). He also explained why this code is so popular (among chemists): it can do a lot of the chemistry chemists are interested in, and the input files for small molecules are simple. After pointing out Gaussian's empty-line quirks, it was time to introduce some of the possible editors to use with Gaussian. In the remainder of his lecture, Dietmar showed us how simple (typical) Gaussian calculations are run, pointing out interesting aspects of the workflow, and the fun of watching vibrations in Molden. He ended his lecture by giving us some tips and tricks for the investigation of transition states and the study of chemical reactions, as a mental preparation for the first hands-on session, which followed after the coffee break.

Lecturers for the Gaussian code: Dietmar Hertsen (left) introducing the basics of the code, while Patrick Bultinck (right) discusses more advanced wave function techniques in more detail.

In the afternoon it was time to take out the big guns. Prof. Patrick Bultinck, of the Ghent Quantum Chemistry Group, was so kind as to provide the advanced session. In this session we were reminded, after two days of using the density as a central property, that wave functions are the only way to obtain perfect results. Unfortunately, practical limitations hamper their application to the systems that are of interest from a physical point of view. Patrick, being a quantum chemist to the bone, at several points stepped away from his slides and showed on the blackboard how several approximations to full configuration interaction (full-CI) can be obtained. He also made us aware of the caveats of such approaches, such as size consistency and basis(-type) dependence for truncated CI, and noted that although CASSCF is a powerful method (albeit not for the fainthearted), it remains somewhat of a black art that should be used with caution. As such, CASSCF was not included in the advanced hands-on sessions guided by Dr. Sofie Van Damme (but who knows what may happen in a future edition of this spring school).

Spring School Computational Tools: Day 3 – Abinit

Today is the third day of our spring school. After the introduction to VASP yesterday, we now turn our attention to another quantum mechanical solid state code: ABINIT. This is a Belgian ab initio code, (mainly) developed at the Université Catholique de Louvain (UCL) in Wallonia.

Since no one at CMM has practical experience with this code and several of us are interested in learning more about it, we were very pleased that the group of Prof. Xavier Gonze was willing to support this day in its entirety. During the morning introductory session Prof. Xavier Gonze introduced us to the world of the ABINIT code, for which the initial ideas stem from 1997. Since then, the program has grown to about 800,000 lines of Fortran code(!) and a team of 50 people worldwide currently contributes to its development. In recent years, a set of Python scripts has also been developed, providing a more user-friendly interface (abipy) for the users of the code. The main goal of the development of this interface is to shift interest back to the physics instead of trying to figure out which keywords do the trick. We also learned that the ABINIT code is strongly inspired by the ‘free software’ model, and as such Prof. Xavier Gonze prefers to refer to the copyright of ABINIT as copyleft. This open-source mentality also seems to strengthen the code: it attracts a large number of developers/contributors, which in turn leads to the implementation of a wide variety of basis sets, functionals and methodologies.

After the general introduction, Dr. Matteo Giantomassi introduced the abipy Python package. This package was developed especially for automating the post-processing of ABINIT results and the generation of input files; in short, to make interaction with ABINIT easier. Matteo, however, also warned that this approach, which makes the use of ABINIT much more of a black box, might confuse beginners, since a lot of magic is going on under the hood of the abipy scripts. However, his presence, and that of the rest of the ABINIT delegation, made sure confusion was kept to a minimum during the hands-on sessions, for which we are very grateful.

After the lunch break, Prof. Gian-Marco Rignanese and Prof. Xavier Gonze held a duo seminar on more advanced topics, covering density functional perturbation theory and, building on this, spectroscopy and phonon calculations beyond the frozen-phonon approach. Of these last aspects, I am really interested in seeing how they cope with my metal organic frameworks.

Spring School Computational Tools: Day 2 – VASP

On this second day of our spring school, the first ab initio solid state code is introduced: VASP, the Vienna Ab initio Simulation Package.

Having worked with this code for almost a full decade, I am considered by some to be an expert, and as such I had the dubious task of providing our participants' first contact with this code. Since all basic aspects and methods had already been introduced on the first day, I mainly focused on presenting the required input files and parameters, and on showing how these should be tweaked for some standard types of solid state calculations. Following this one-hour introduction, in which I apparently had not yet scared our participants too much, all participants turned up for the first hands-on session, where they got to play with the VASP program.

In the afternoon, we were delighted to welcome our first invited speaker, straight from the VASP headquarters: Dr. Martijn Marsman. He introduced us to advanced features of VASP going beyond standard DFT. He showed the power (and limitations) of hybrid functionals and introduced the quasi-particle approach of GW. We even went beyond GW with the Bethe-Salpeter equation (which includes electron-hole interactions). Unfortunately, these much more accurate approaches are also much more expensive than standard DFT, but work is being done on a cubic-scaling RPA implementation, which will provide a major step forward in the field of solid state science. Following this session, a second hands-on session took place, where exercises linked to these more advanced topics were provided and eagerly tried by many of the more advanced participants.

Spring School Computational Tools: Day 1

Today our one-week spring school on computational tools for materials science kicked off. During this week, Kurt Lejaeghere and I host this spring school, which we have been busily organizing over the last few months, intended to introduce materials scientists to the use of four major ab initio codes (VASP, ABINIT, Gaussian and CP2K). During this first day, all participants are immersed in the theoretical background of molecular modeling and solid state physics.

Prof. Karen Hemelsoet presented a general introduction to molecular modeling, showing us which computational techniques are useful to treat problems of varying scales, both in space and time. With the focus on the modeling of molecules, she told us everything there is to know about the potential energy surface (PES) and how to investigate it using different computational methods. She discussed the differences between localized (i.e. Gaussian) and plane wave basis sets and taught us how to accurately sample the PES using both molecular dynamics and normal mode analysis. As a final topic she introduced us to the world of computational spectroscopy, showing how infrared spectra can be simulated, and the limitations of this type of simulation.
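
As a small aside on the normal mode analysis mentioned above (a toy example of my own, not part of the course material): the vibrational frequencies follow from diagonalizing the mass-weighted Hessian, and negative eigenvalues would signal imaginary frequencies, i.e. that the structure is not a minimum on the PES.

```python
import numpy as np

def normal_modes(hessian, masses):
    """Harmonic frequencies (in units of sqrt(k/m)) from a Cartesian
    Hessian and per-coordinate masses via mass-weighting."""
    m = np.asarray(masses, dtype=float)
    weights = 1.0 / np.sqrt(np.outer(m, m))
    eigvals, eigvecs = np.linalg.eigh(hessian * weights)
    # negative eigenvalues correspond to imaginary frequencies
    freqs = np.sign(eigvals) * np.sqrt(np.abs(eigvals))
    return freqs, eigvecs

# toy system: two unit masses connected by a single spring (1D), k = 1
k = 1.0
hessian = np.array([[ k, -k],
                    [-k,  k]])
freqs, modes = normal_modes(hessian, masses=[1.0, 1.0])
print(freqs)   # ~[0, sqrt(2)]: one translation and one stretching mode
```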

With the somewhat mysterious presentations of Prof. Stefaan Cottenier, we moved from the realm of molecules to that of solids. In his first session, he introduced density functional theory, a method ideally suited to treat extended systems at the quantum mechanical level, and showed that as much information is present in the electron density of a system as in its wave function. In his second session, we fully plunged into the world of solids, and were guided, step by step, towards a full understanding of the technical details generally found in the methods section of (ab initio) computational materials science work. Throughout this session, NaCl was used as an ever-present example, and we learned that our simple high-school picture of bonding in kitchen salt is a lie-to-children. In reality, Cl doesn’t gain an extra electron by stealing it away from Na; rather, the Na 3s electron lives too far away from the Na nucleus it belongs to.

De-activating an active atom.

It could be that I’ve perhaps found out a little bit about the structure of atoms. You must not tell anyone anything about it…
–Niels Bohr (1885 – 1965),
in a letter to his brother (1912)

Getting the news that a paper has been accepted for publication is exciting, but it can also be a little bit sad, since it indicates the end of a project. A little over a month ago we got this great news regarding our paper for the Journal of Chemical Information and Modeling. It was the culmination of a side project Goedele Roos and I had been working on, in an on-and-off fashion, over the last two years.

When we started the project, each of us had his/her own goal in mind. In my case, it was my interest in showing that my Hirshfeld-I code could handle systems which are huge from the point of view of quantum mechanical calculations. Goedele, on the other hand, was interested in seeing how well Hirshfeld-I charges behave with increasing size of a molecular fragment. This is of interest for multiscale modeling approaches, for which Martin Karplus, Michael Levitt, and Arieh Warshel got the Nobel prize in chemistry in 2013. In such an approach, a large system, for example a solvated biomolecule containing tens of thousands of atoms, is split into several regions. The smallest, central region, which contains the part of the molecule one is interested in, is studied quantum mechanically and generally contains a few dozen up to a few hundred atoms. The second shell is much larger, is described by force-field approaches (i.e. Newtonian mechanics), and can contain tens of thousands of atoms. Even further from the quantum mechanically treated core, a third region is described by continuum models.
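
As a toy illustration of how such a central region could be carved out (a hypothetical sketch, not the setup of any specific multiscale code): the quantum mechanical region is often simply taken as everything within a chosen radius of the site of interest.

```python
import numpy as np

def select_qm_region(coords, center_idx, radius):
    """Return the indices of all atoms lying within `radius` of the atom
    with index `center_idx`; `coords` is an (N, 3) array of positions."""
    coords = np.asarray(coords, dtype=float)
    dist = np.linalg.norm(coords - coords[center_idx], axis=1)
    return np.where(dist <= radius)[0]

# hypothetical coordinates (in angstrom) for a 5-atom toy "biomolecule"
positions = np.array([[0.0, 0.0,  0.0],    # atom 0: the site of interest
                      [1.5, 0.0,  0.0],
                      [0.0, 3.0,  0.0],
                      [6.0, 0.0,  0.0],
                      [0.0, 0.0, 12.0]])

qm_atoms = select_qm_region(positions, center_idx=0, radius=5.0)
print(qm_atoms)   # -> [0 1 2]: only the nearby atoms end up in the QM region
```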

What about the behavior of the charges? In a quantum mechanical approach, even though we still speak of electrons as if referring to classical objects, we cannot point to a specific point in space and say: “There it is”. We only have a probability distribution in space indicating where the electron may be. As such, it also becomes hard to pinpoint an atom and, in an absolute sense, measure/calculate its charge. However, because such concepts are so much more intuitive, many chemists and physicists have developed methods, with varying success, to split the electron probability distribution into atoms again. When applying such a scheme to the probability distributions of fragments of a large biomolecule, we would like the atoms at the center not to change too much when the fragment is made larger (i.e. made to contain more atoms). This would indicate that from some point onward you have included all atoms that interact with the central atoms. I think you can already see the parallel with the multiscale modeling approach mentioned above, where that point would indicate the boundary between the quantum mechanical and the Newtonian shell.
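
For the curious, the original (non-iterative) “stockholder” Hirshfeld idea fits in a few lines: every atom claims a share of the total electron density proportional to its isolated-atom reference density at each point in space. The sketch below is a one-dimensional toy with made-up Gaussian densities; the Hirshfeld-I scheme used in our paper additionally iterates those reference densities to self-consistency, which is not shown here.

```python
import numpy as np

def gaussian(x, center, width):
    return np.exp(-((x - center) / width) ** 2)

# toy 1D "molecule": the total density is a sum of two overlapping peaks
x = np.linspace(-5.0, 5.0, 2001)
rho_total = 1.2 * gaussian(x, -1.0, 1.0) + 0.8 * gaussian(x, 1.0, 1.0)

# pro-atom densities: isolated-atom reference densities placed at the nuclei
pro_A = gaussian(x, -1.0, 1.0)
pro_B = gaussian(x,  1.0, 1.0)

# stockholder weights: each atom's share of the density at every point
w_A = pro_A / (pro_A + pro_B)
w_B = pro_B / (pro_A + pro_B)

dx = x[1] - x[0]
pop_A = np.sum(w_A * rho_total) * dx   # electron population assigned to atom A
pop_B = np.sum(w_B * rho_total) * dx
print(f"population A = {pop_A:.3f}, population B = {pop_B:.3f}")
```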

Convergence of Hirshfeld-I charges for clusters of varying size of a biomolecule. The black curves show the charge convergence of an active S atom, while the red curves indicate a deactivated S atom.

Although we expected merely to be studying this convergence behavior for the particular partitioning scheme I had implemented, we dug up an unexpected treasure. Of the set of central atoms we were interested in, all except one showed the nice (and boring) convergence behavior. The exception (a sulfur atom) showed a clear lack of convergence; it didn’t even show any intent toward convergence, even for our system containing almost 1000 atoms. However, unlike the other atoms we were checking, this S atom had a special role in the biomolecule: it was an active site, i.e. the atom where chemical reactions of the biomolecule with whatever other molecule or atom are expected to occur.

Because this S atom had a formal charge of -1, we bound an H atom to it and investigated this new set of fragments. In this case, the S atom, with the H atom bound to it, was no longer an active site. Lo and behold, the S atom showed perfect convergence, just like all other atoms of the central cluster. This shows us that an active site is more than an atom sitting at the right place at the right time. It is an atom which is reaching out to the world, interacting with other atoms over a very long range, drawing them in (>10 ångström = 1 nm is very far on the atomic scale; imagine being able to touch someone who is standing >20 m away from you). Unfortunately, this is rather bad news for multiscale modeling, since it means that if you want to describe such an active site accurately, you will need an extremely large central quantum mechanical region. When the active site is deactivated, on the other hand, a radius of ~0.5 nm around the deactivated site is already sufficient.
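
The kind of convergence test behind the figure above can be sketched in a few lines of Python; the charge series below are invented placeholders (not the data from our paper) and the 0.01 e tolerance is an arbitrary choice.

```python
import numpy as np

def is_converged(charges, tol=0.01, n_last=3):
    """Consider a charge-vs-cluster-size series converged when the spread
    over the last `n_last` cluster sizes stays below `tol` (in e)."""
    tail = np.asarray(charges[-n_last:], dtype=float)
    return (tail.max() - tail.min()) < tol

# hypothetical charge series (in e) for clusters of increasing size
inactive_site = [-0.35, -0.30, -0.285, -0.283, -0.282]   # settles down quickly
active_site   = [-0.60, -0.48, -0.55,  -0.42,  -0.51]    # keeps wandering

print("inactive site converged:", is_converged(inactive_site))  # True
print("active   site converged:", is_converged(active_site))    # False
```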

Similar to Bohr, I have the feeling that “It could be that I’ve perhaps found out a little bit about the structure of atoms.”, and it makes me happy.

Convergence of Atomic Charges with the Size of the Enzymatic Environment

Authors: Danny E. P. Vanpoucke, Julianna Oláh, Frank De Proft, Veronique Van Speybroeck, and Goedele Roos
Journal: J. Chem. Inf. Model. 55(3), 564-571 (2015)
doi: 10.1021/ci5006417
IF(2015): 3.657
export: bibtex
pdf: <J.Chem.Inf.Model.> 
Graphical Abstract: The influence of the cluster size and water presence on the atomic charge of active and inactive sites in biomolecules.

Abstract

Atomic charges are a key concept to give more insight into the electronic structure and chemical reactivity. The Hirshfeld-I partitioning scheme applied to the model protein human 2-cysteine peroxiredoxin thioredoxin peroxidase B is used to investigate how large a protein fragment needs to be in order to achieve convergence of the atomic charge of both neutral and negatively charged residues. Convergence in atomic charges is rapidly reached for neutral residues, but not for negatively charged ones. This study pinpoints difficulties on the road towards accurate modeling of negatively charged residues of large biomolecular systems in a multiscale approach.

39th ICACC: day 3-5

The last three days of the conference, the virtual materials design session took place. This session was specifically focused on computational materials design. Because of this focus, the attendance was rather low and consisted mainly of computational materials scientists. Apparently, this type of specialized focus on computational work is the best way not to reach the general experimental public in the same field. As a computational scientist, the only way to circumvent this is to apply for a presentation in a relevant experimental session. This requires you to make a less technical presentation, but that is not a bad thing, since it forces you to think about your results and understand them in simpler terms.

An example of such a presentation was given by Dr. Ong, who discussed his high-throughput ab initio setup for designing solid state electrolytes. He showed that a material (Li10GeP2S12) which was thought to be a 1D conductor is actually a 3D conductor; however, the Li conductivity in the directions perpendicular to the 1D direction is ~100 times smaller, explaining why these pathways were not noticed before. He also presented newly predicted materials for Li transport, which led to the standard experimental remark that such computational predictions mean very little, since they do not take temperature into account, and as such these structures may not be stable. Although such remarks are "in theory" true, and are a nice example of a lack of understanding outside the computational community, in this case the material the experimental researcher was referring to had recently been synthesized and found to be stable.
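
As a side note on how Li mobility is often quantified in such studies (a generic sketch under my own assumptions, not Dr. Ong's actual workflow): the tracer diffusion coefficient can be estimated from the mean-squared displacement of the mobile ions in a molecular dynamics trajectory via the Einstein relation, and is in turn related to the ionic conductivity through the Nernst-Einstein equation.

```python
import numpy as np

def diffusion_coefficient(positions, dt):
    """Estimate the tracer diffusion coefficient D from the Einstein
    relation MSD(t) ~ 6*D*t (3D, long-time limit). `positions` has shape
    (n_frames, n_atoms, 3); `dt` is the time between frames."""
    disp = positions - positions[0]                   # displacement w.r.t. t = 0
    msd = np.mean(np.sum(disp ** 2, axis=2), axis=1)  # average over the atoms
    t = np.arange(len(msd)) * dt
    half = len(t) // 2                                # fit only the long-time part
    slope = np.polyfit(t[half:], msd[half:], 1)[0]
    return slope / 6.0

# hypothetical random-walk trajectory, just to exercise the function
rng = np.random.default_rng(0)
trajectory = np.cumsum(rng.normal(scale=0.1, size=(2000, 50, 3)), axis=0)
print("D ~", diffusion_coefficient(trajectory, dt=1.0), "(arbitrary units)")
```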

On Thursday morning, I had the opportunity to give my contributed presentation. In contrast to my invited presentation, this presentation was solely focused on doped cerium dioxide. Using a three-step approach, we investigated the different contributions of the dopants to the modification of the mechanical properties of CeO2. In the first step, we look at group IV dopants, since Ce has an oxidation state of +IV in CeO2. Here we show that the character of the valence electrons (p or d) plays an important role with regard to stability and mechanical properties. In the second step, dopants with an oxidation state different from +IV are considered, without the presence of oxygen vacancies. In this case, the same trends and behavior are observed as in the first step. In the third and last step, oxygen vacancies are included as well. We show that oxygen vacancies have a stabilizing influence on the doped system. Furthermore, the oxygen vacancies make CeO2 mechanically softer, i.e. they reduce the bulk modulus.
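
For readers wondering where a bulk modulus comes from in this kind of work: one computes the total energy at a series of cell volumes and fits an equation of state, e.g. the third-order Birch-Murnaghan form. The sketch below does exactly that on synthetic data; the numbers are placeholders, not our CeO2 results.

```python
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, B0p):
    """Third-order Birch-Murnaghan equation of state E(V)."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * B0p + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta))

# synthetic E(V) points, as one would get from fixed-volume relaxations
V = np.linspace(36.0, 44.0, 9)                                     # angstrom^3
E = (birch_murnaghan(V, -100.0, 40.0, 1.2, 4.5)
     + np.random.default_rng(1).normal(scale=1e-4, size=V.size))   # eV + noise

p0 = (E.min(), V[np.argmin(E)], 1.0, 4.0)        # reasonable starting guess
(E0, V0, B0, B0p), _ = curve_fit(birch_murnaghan, V, E, p0=p0)
print(f"V0 = {V0:.2f} A^3, B0 = {B0:.3f} eV/A^3 = {B0 * 160.22:.0f} GPa")
```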

In the afternoon, Prof. Frederico Rosei, of NanoFemto lab in Canada, gave an entertaining lecture on “Mentorship for young scientists: Developing scientific survival skills”. It presented an interesting forum to find out that, as scientists, we all seem to struggle with the same things. We want to do what we like (research), and invest a lot in this, unfortunately external forces (the struggle for funding/job security) complicate life. Frederico centered his lecture around three points of importance/goals a young scientist should always try to be aware of:

  • Know yourself
  • Plan ahead
  • Find a mentor

Although these are lofty goals, they tend to be quite non-trivial in the current-day scientific environment. Finding a mentor, i.e. a senior scientist with time on his/her hands who is not involved in your projects, is a bit like looking for a unicorn. Unlike unicorns, they do exist, but there are very, very few of them (how many professors do you know with spare time?). Planning ahead, and following your own plans, is nice in theory; unfortunately, a young scientist’s life (i.e. everyone below tenured professorship) tends to be ruled by funding in a kind of life-or-death setup. I am not saying this is not the case for tenured professors; however, it is not their own life and death. For all other scientists: no funding = no job = end of scientific career. As such, the pressure to publish is high (yes, funding agencies only count your papers, not how good or bad they are, even though the official statement says otherwise), and it will have a detrimental influence on the quality of science and of what is being published (if it isn’t already the case). I truly wish the world could be as Prof. Rosei envisions it. Back to happier subjects.

Friday was the last day of the conference. In the morning I again attended the virtual materials design session, but as with all other sessions several presentations were canceled; apparently snow was wreaking havoc at the New York airport, preventing several presenters from making it. Luckily, Eva Zarkadoula made it to the conference to present her very nice modeling work: "Molecular Dynamic Simulations of Synergistic Effects in Ion Track Formation". Using classical molecular dynamics simulations, she showed how incoming high-energy radiation traces a path through a material, allowing one to use this material as a detector. In contrast to what I would have imagined, perfectly crystalline material shows very little damage after the radiation has passed through. Even though a clear trace is initially visible, the system appears to relax back to a more or less perfect crystalline solid, making it a rather poor sensor material. However, if defects are present in the material, the track made by the radiation remains clearly visible. An extremely nice bonus to this work is the fact that direct comparison to experiments is possible.

The conference ended at noon, leaving some time to have a walk on the beach, find some souvenirs, and have a last dinner with colleagues from the conference.