Latest

new ..."AI is the centre of the universe and edges are the way that you'll put it into production. Zero Trust is the way that you'll end up securing it, and ultimately quantum will be the thing that powers it over the long term for the performance and efficiency needed to scale it into a global system," said John Roese, Global Chief Technology Officer. He encouraged thinking actively about AI but conveyed the importance of integrating it with other architectures for long-term success.IT Brief Australia, 9h ago
new The great technological leaps forward of the past—from the advent of the steam engine to personal computers and the internet—each empowered humans by augmenting their physical and computational capabilities. The artificial intelligence technologies of today, however, expand the domain of technological augmentation to areas long thought to be uniquely human, like creativity. Generative AI's mastery of what was considered distinctly human means it impacts the professional identity of knowledge workers in ways that we have not seen before, portending a future that looks very different from the world today. Fortune, 9h ago
new Request a free sample copy in PDF or view the report summary: https://www.expertmarketresearch.com/reports/telemedicine-market/requestsample

Telemedicine Market Overview

Understanding Telemedicine: Telemedicine encompasses a wide range of services, from virtual doctor consultations to remote patient monitoring and telepharmacy. By eliminating the need for physical presence, it makes healthcare services accessible to individuals globally.

Market Size and Growth: The telemedicine market achieved a substantial size of USD 73.1 billion in 2023 and is poised to continue its growth journey at a CAGR of 19.3% from 2024 to 2032, ultimately reaching USD 377.0 billion by 2032. This growth can be attributed to several key factors, explored in detail below.

Telemedicine Market Dynamics

Technological Advancements: The rapid evolution of technology is a driving force behind the telemedicine boom. High-speed internet, smartphones, wearable devices, and improved telecommunication infrastructure have all played pivotal roles in making remote healthcare services accessible. Telemedicine platforms now boast high-quality video and audio capabilities, ensuring seamless communication between patients and healthcare providers.

Increased Adoption of Teleconsultation: Acceptance of teleconsultation has been steadily increasing. Patients have come to appreciate the convenience and accessibility of virtual appointments, particularly for non-emergency consultations. The COVID-19 pandemic further accelerated this trend, highlighting the importance of remote healthcare services.

External Telemedicine Market Trends

Changing Regulatory Landscape: Governments and regulatory bodies worldwide are adapting to accommodate telemedicine, implementing policies and regulations to ensure patient safety, data privacy, and the growth of telehealth services. Staying informed about these evolving regulations is crucial for telemedicine providers.

Remote Monitoring and IoT Integration: The integration of Internet of Things (IoT) devices into telemedicine has opened up new possibilities. Remote monitoring of vital signs and health parameters enables proactive healthcare management: patients can transmit real-time data to healthcare professionals, leading to more accurate diagnoses and treatment adjustments.

Explore the full report with the table of contents: https://www.expertmarketresearch.com/reports/telemedicine-market

Telemedicine Market Segmentation

Patient Demographics: Telemedicine serves a diverse range of patients, from tech-savvy individuals to the elderly and those residing in remote areas with limited healthcare access. Understanding these demographics is vital for tailoring services effectively.

Specialty Areas: Telemedicine extends beyond general consultations to specialty areas including telepsychiatry, teledermatology, teleoncology, and more. Each specialty has unique requirements and considerations, necessitating market segmentation.

Telemedicine Market Growth

Global Expansion: Telemedicine knows no geographical boundaries. Its reach is expanding worldwide, with healthcare providers, tech companies, and startups entering the market from different corners of the globe. This global expansion is contributing significantly to the industry's rapid growth.

Improved Patient Outcomes: Research indicates that telemedicine can lead to improved patient outcomes. Timely consultations, continuous monitoring, and better access to healthcare professionals contribute to early diagnosis and effective management of various medical conditions.

Recent Developments in the Telemedicine Market

Telemedicine Platforms: Telemedicine platforms are continually evolving to offer more features and capabilities. Many now integrate electronic health records (EHRs), prescription management, and secure patient messaging, enhancing the overall patient experience.

AI and Telemedicine: Artificial intelligence (AI) is making its presence felt in telemedicine. Machine learning algorithms are being employed to analyze medical data, predict patient outcomes, and enhance diagnostic accuracy. The integration of AI promises to advance telemedicine further.

Telemedicine Market Scope

Patient Convenience: Telemedicine offers unparalleled convenience. Patients can schedule appointments at their convenience, eliminating lengthy commutes and extended wait times in crowded waiting rooms.

Cost Savings: Telemedicine presents cost savings for both patients and healthcare providers. Patients save on travel expenses and time, while healthcare providers can optimize their resources more efficiently.

Telemedicine Market Analysis

Key Players: The telemedicine market boasts a diverse array of key players, including established healthcare institutions, technology firms, and startups. Prominent players include Teladoc Health, Amwell, Doctor on Demand, and numerous others. These companies offer a wide array of telehealth services and continue to innovate in the field.

Patent Analysis: Analyzing patents is crucial to understanding the technological innovations propelling the telemedicine market. It offers insights into the key players' areas of focus and hints at potential future developments.

Grants and Funding: Monitoring grants and funding within the telemedicine sector provides valuable insights into market trends and growth areas. Government support and private investment often signify confidence in the market's potential.

Clinical Trials: Clinical trials within the telemedicine realm are essential for validating the efficacy and safety of remote healthcare solutions. Keeping abreast of ongoing trials can provide valuable information about emerging telemedicine treatments and technologies.

Partnerships and Collaborations: Partnerships and collaborations among telemedicine providers, healthcare organizations, and technology companies are commonplace. These alliances often result in innovative solutions and expanded service offerings.

FAQ: Addressing Common Questions

1. Is telemedicine as effective as in-person visits? Telemedicine has proven highly effective for many types of consultations and follow-ups. However, certain cases necessitate physical examinations or procedures, mandating in-person visits.

2. Is telemedicine secure and private? Telemedicine platforms prioritize security and privacy, employing encryption and adhering to stringent data protection regulations to safeguard patient information.

3. How can I access telemedicine services? Accessing telemedicine services is straightforward. Many healthcare providers have their own telemedicine platforms or collaborate with established telehealth companies. Patients can typically schedule appointments through websites or mobile apps.

4. Will insurance cover telemedicine consultations? Insurance coverage for telemedicine varies by provider and policy. Many insurance companies now offer coverage for telehealth services, but it's essential to verify specific plan details.

Related Report: Surgical Robots Market...openPR.com, 13h ago
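The market-size figures in the telemedicine item above imply a compound annual growth rate over the nine years from 2023 to 2032. As a quick sanity check, CAGR can be back-computed from the endpoint values; a minimal sketch (the function name is illustrative, and because the report's endpoints are rounded, the back-computed rate differs slightly from the stated 19.3%):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Report endpoints: USD 73.1 bn (2023) -> USD 377.0 bn (2032), i.e. 9 years.
implied = cagr(73.1, 377.0, 2032 - 2023)
# Comes out near 20%, in the same ballpark as the report's stated 19.3%.
print(f"implied CAGR: {implied:.1%}")
```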
new Download Free Sample of Report - https://www.globalinsightservices.com/request-sample/GIS25711/?utm_source=pranalipawar&utm_medium=Openpr&utm_campaign=04122023

Security scanning equipment is typically composed of several components, including scanners, detectors, and monitors. Scanners are used to detect and identify potential threats, such as malware and viruses. Detectors look for signs of malicious activity, such as unauthorized access to a system or network. Monitors constantly watch for suspicious activity and alert administrators to any potential threats.

Security scanning equipment is essential for any organization that wants to protect its data and systems. It helps organizations detect malicious activity and respond quickly to potential threats, and it reduces the risk of data breaches and other security incidents. Security scanning equipment is an important part of any security strategy and should be implemented to ensure the safety and security of an organization's data and systems.

Key Trends

Security scanning equipment is a broad term that encompasses a wide variety of devices used to detect, identify, and prevent security threats. The technology has been evolving rapidly in recent years as organizations strive to keep up with the ever-changing security landscape. Some of the key trends:

First, the use of biometrics is becoming increasingly popular. Biometric authentication is a process whereby a person's physical characteristics, such as a fingerprint or iris scan, are used to authenticate their identity. This technology is becoming more common in many industries and is being used to secure areas as well as to verify transactions.

Second, the use of facial recognition technology is also growing. This technology uses facial recognition algorithms to identify individuals and can be used for a variety of security purposes. It is becoming increasingly common in public places, such as airports and stadiums, as well as in corporate environments.

Third, the use of artificial intelligence (AI) is becoming more prevalent. AI can be used to alert security personnel to potential threats before they materialize, and to analyze large amounts of data quickly and accurately, allowing for better decision-making and faster response times.

Finally, cloud-based security scanning solutions are becoming more popular. With cloud-based solutions, organizations can access their security systems from anywhere in the world, allowing for greater flexibility and scalability as well as faster response times.

As the security landscape continues to evolve, organizations must stay ahead of the curve by using the latest technology available to them, ensuring that their security systems are up to date and can effectively protect the organization from potential threats.

Key Drivers

The Security Scanning Equipment Market is driven by the increasing need for security and surveillance in the public and private sectors. The rising number of threats to national security, and the need for quick and accurate detection of potential threats, have created strong demand for security scanning equipment. As a result, the market has seen steady growth over the past few years.

The first key driver is the government's increased focus on security. Governments around the world are investing heavily in security measures, including the procurement of scanning equipment. This is especially true in developed countries, where governments have implemented stringent security measures to protect their citizens. For instance, the United States has adopted a "see something, say something" approach to security, which asks citizens to report any suspicious activity to law enforcement. As a result, demand for security scanning equipment has increased significantly.

Report Overview - https://www.globalinsightservices.com/reports/security-scanning-equipment-market/?utm_source=pranalipawar&utm_medium=Openpr&utm_campaign=04122023

The second key driver is the rise of terrorist activities. Terrorists have become increasingly sophisticated in their use of technology to carry out attacks. Governments and private companies are therefore investing heavily in advanced scanning equipment able to detect and identify potential threats quickly and accurately.

The third key driver is the development of new technologies. Advances in technology have enabled advanced scanning equipment that makes it easier to detect and identify potential threats. For instance, 3D imaging technology has enabled devices that can detect objects hidden within walls and other structures.

The fourth key driver is the increasing demand for safety and security in public spaces. With the recent increase in mass shootings and other public safety incidents, governments and private companies are investing heavily in advanced scanning equipment to detect and prevent such incidents, further strengthening demand.

Get a customized scope to match your need, ask an expert - https://www.globalinsightservices.com/request-customization/GIS25711/?utm_source=pranalipawar&utm_medium=Openpr&utm_campaign=04122023

Finally, the fifth key driver is the increasing use of biometric technologies, which allow individuals to be identified through their unique physical characteristics. This has made it easier for law enforcement and private companies to identify potential threats quickly and accurately, and demand for security scanning equipment has risen accordingly.

Market Segmentation

The Security Scanning Equipment Market is segmented by Detection Technology, Application, End User, and Region. On the basis of Detection Technology, the market is segmented into X-ray, CT-based, Neutron Sensing and Detection, and Other Detection Technologies. Based on Application, the market is bifurcated into Mail and Parcel, and Baggage Scanning. Based on End User, the market is segmented into Airports, Ports and Borders, and Defense. Region-wise, the market is segmented into North America, Europe, Asia-Pacific, and Rest of the World.

Key Players

Some of the key players in the Security Scanning Equipment Market are Smiths Detection Inc. (UK), Leidos Holdings Inc. (US), OSI Systems Inc. (US), 3DX-Ray Ltd (US), Teledyne ICM SA (US), Analogic Corporation (US), Nuctech Company Limited (China), Astrophysics Inc. (US), CEIA SpA (Italy), and Gilardoni SpA (Italy).

Buy Now - https://www.globalinsightservices.com/checkout/single_user/GIS25711/?utm_source=pranalipawar&utm_medium=Openpr&utm_campaign=04122023

With Global Insight Services, you receive: 10-year forecast to help you make strategic decisions...openPR.com, 16h ago
new Despite the successes of ab initio approaches in a wide range of computer simulations, the team notes that efforts to employ quantum mechanical ab initio methods to predict materials' properties have not been able to achieve quantum accuracy and scale on the powerful supercomputers needed to perform these simulations. In the abstract to their Gordon Bell Prize-winning project the authors write, “Ab initio electronic-structure has remained dichotomous between achievable accuracy and length-scale. Quantum Many-Body (QMB) methods realize quantum accuracy but fail to scale.”...HPCwire, 19h ago
new At the beginning of ECP, EQSIM could resolve motions of 2 Hertz in a material model where the shear speed exceeds 500 meters per second. Now, the team has achieved 10 Hertz in a model that resolves motion down to a shear speed of 140 meters per second—providing a far more realistic predictive tool for validating infrastructure risk. “Earthquake hazards and risks are a problem of national importance,” says McCallen. “The damage that could result from a large earthquake is a bigger societal problem than most people imagine.” Indeed, according to USGS and the Federal Emergency Management Agency, earthquakes cost the nation an estimated $14.7 billion annually in building damage and related losses. Says McCallen, “We’ve really pushed the envelope of what is possible computationally and made tremendous advancements. EQSIM has become a practical tool for realistically evaluating and planning for the effects of future earthquakes on critical infrastructure, providing a basis for objectively and reliably designing or retrofitting structures.”...HPCwire, 19h ago


new The most unlikely event to ever occur, dear Eva, is you. You stand on the shoulders of giants of causality, relying on the precisely timed meeting of genetic information from two humans. Any minute adjustment to the timing of that meeting and different gametes would have combined and you’d be a totally different person, at least physically. You would, of course, still be some combination (and minor mutation) of your parents’ range of possible genes, but what are those genes if not additional unique combinations of their parents, equally reliant on precise timing, going backward in time billions of years through common ancestors and particle collisions and elemental fusion to the big bang? You won the lottery, Eva, not once, not twice, but over and over again, simply by existing as a conscious being who is capable of understanding the heuristic miracle that is yourself. I hope that makes you feel nice, as it does me. GlennB...the Guardian, 1d ago
new As the high-speed railway network in China extends beyond 40,000 kilometers, maintaining seamless internet connectivity for passengers is becoming increasingly challenging. The demand for consistent and reliable online access is particularly crucial for travelers who spend extended hours on trains, expecting undisturbed work, study, or entertainment. Addressing this need, a team of researchers from the School of Computer Science at Peking University developed ‘HiMoDiag’—an innovative tool designed to enhance the understanding and management of network performance in extremely high-mobility scenarios.

“HiMoDiag stands out as a TCP-LTE/5G cross-layer performance analysis tool that enables full-stack, real-time, comprehensive analysis and visualization of network performance from the application layer down to the physical layer,” explains Chenren Xu, corresponding author of the study. “It not only captures performance data across all layers of the network between user application and service provider but also visualizes it in a way that's actionable for network operators.”

The team published their results in the KeAi journal High-speed Railway.

Notably, HiMoDiag is particularly useful in scenarios where passengers expect stable internet connectivity to support their work and leisure activities during travel. The tool’s real-time analysis and visualization capabilities allow for immediate diagnosis and optimization of network performance—an area where traditional diagnostic tools tend to fall short.

"By integrating cross-layer data analysis, we can now pinpoint and address issues much faster than before," Xu added. "This means less downtime for passengers and a more reliable service overall."

The design of HiMoDiag addresses several challenges, including clock synchronization across network layers and endpoints, managing the substantial data volume resulting from 5G's high bandwidth, and mitigating interference arising from performance-indicator transmission. Implemented and evaluated on an extensive dataset collected on trains running at up to 350 km/h, HiMoDiag has the potential to improve network performance, ensuring an enhanced user experience for passengers as well as the quality of mission-critical services provided by LTE-R or 5G-R. HiMoDiag’s experiment platform also allows flexible control over mobile devices, facilitating various types of network experiments. newswise.com, 1d ago
new Looking ahead, technological advancements will further increase the importance of robust computer security for nuclear safety and security at the State and facility levels. Rapidly evolving technologies such as artificial intelligence are promising in terms of solving some problems and improving digitally controlled operations. At the same time, they present new challenges that need to be addressed. Similarly, wireless and automation technologies are being considered and used today in advanced nuclear reactor designs such as small modular reactors and microreactors. As cyberthreats are constantly and rapidly evolving, IAEA support for Member States’ needs for enhancing computer security for nuclear safety and security requires agility to keep abreast of all the new opportunities and challenges of these novel technologies in order to provide the most efficient standards, best practices, training and guidelines. This is what the IAEA Department of Nuclear Safety continuously strives for.iaea.org, 1d ago

Top

...- If you look at the way that we come to understand the world, there are poets, there are musicians, there are high-energy physicists, there are mathematicians. What is it about the Universe, living and non-living, that seems to require very different forms of expertise? Now, there are people who say, "We don't, in the end it's just quantum field theory, all that stuff, it's just noise. If we really understood the Universe deeply, we'd just all be physicists." And of course, many of us consider that very muddle-headed. So how do you explain the need for new ideas, new disciplines, at different scales of organization? And that's where emergence comes in. And it turns out that one of the interesting properties of the Universe and the world is that it can be understood as a set of hierarchies, each of which becomes somewhat independent of the levels below. Right, and understanding how that comes about is the 'emergence problem.' Right, and the way I often say this is that if you go to a new city and you need to get around and you go to rent a car, what you need to understand is the map of the city, not the physics of the engine. And COVID for me was the "come to Jesus moment" of reckoning with excessive specialization. Because what happened with COVID, it was a virus, right? It becomes an immunological problem that becomes an epidemiological problem, which becomes a transport problem, which becomes an economic problem, which becomes a human well-being and professions problem, which becomes a school problem, etc. So what happened during the course of the pandemic is that our sensibilities matured in understanding that what we are dealing with here was a complex system. There were other dimensions to this problem that were being neglected precisely because we were not reckoning with the interconnectedness of the system. So it's not just that it's the neglect, it's actually pathologically dangerous for the well-being of the planet that we do this kind of atomization all the time at the level of the disciplines. Big Think, 20d ago
True, but you can always wriggle out by saying that all of that doesn't count as "truly understanding". Yes, LLMs' capabilities are impressive, but does drawing SVGs change the fact that somewhere inside the model all of these capabilities are represented by "mere" number relations?

Do LLMs "merely" repeat the training data? They do, but do they do it "merely"? There is no answer, unless somebody gives a commonly accepted criterion of "mereness". The core issue is of course that since no one has a more or less formal and comprehensive definition of "truly understanding" that everyone agrees with, you can play with words however you like to rationalize whatever prior you had about LLMs.

Substituting one vaguely defined concept, "truly understanding", with another vaguely defined concept, a "world model", doesn't help much. For example, does "this token is often followed by that token" constitute a world model? If not, why not? It is really primitive, but who said a world model has to be complex, or have something to do with 3D space or theory of mind, to be a world model? Isn't our manifest image of reality also a shadow on the wall, since it lacks "true understanding" of underlying quantum fields or superstrings or whatever, in the same way that a long list of correlations between tokens is a shadow of our world?

The "stochastic parrot" argument has been armchair philosophizing from the start, so no amount of evidence like that will convince people who take it seriously. Even if LLM-based AGI takes over the world, the last words of such a person are gonna be "but that's not true thinking". And I'm not using that as a strawman - there's nothing wrong with a priori reasoning as such, unless you're doing it wrong.

I think the best response to "stochastic parrot" is asking three questions:

1. What is your criterion of "truly understanding"? Answer concretely, in terms of the structure or behavior of the model itself, and without circular definitions like "having a world model" which is defined as "conscious experience" which is defined as "feeling the redness of red", etc. Otherwise the whole argument becomes completely orthogonal to any reality at all.

2. Why do you think LLMs do not satisfy that criterion while the human brain does?

3. Why do you think it is relevant to any practical intents and purposes, for example to the question "will it kill you if you turn it on"?...lesswrong.com, 26d ago
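For concreteness, the "this token is often followed by that token" model mentioned above can be made literal in a few lines; a toy bigram table (an illustration of the argument, not a claim about how LLMs actually work):

```python
from collections import defaultdict

def bigram_model(tokens):
    """The most primitive 'world model' candidate discussed above:
    for each token, record which token most often follows it."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(tokens, tokens[1:]):
        counts[a][b] += 1
    # keep only the most frequent successor of each token
    return {a: max(nxt, key=nxt.get) for a, nxt in counts.items()}

corpus = "the cat sat on the mat the cat ran".split()
model = bigram_model(corpus)
print(model["the"])  # prints "cat" - 'the' is followed by 'cat' twice, 'mat' once
```

Whether a table like this "counts" as a world model is precisely the definitional question the post is pressing on.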
The complex of methyltransferase-like proteins 3 and 14 (METTL3-14) is the major enzyme that deposits N6-methyladenosine (m6A) modifications on mRNA in humans. METTL3-14 plays key roles in various biological processes through its methyltransferase (MTase) activity. However, little is known about its substrate recognition and methyl transfer mechanism from its cofactor and methyl donor S-adenosylmethionine (SAM). Here, we study the MTase mechanism of METTL3-14 by a combined experimental and multiscale simulation approach using bisubstrate analogues (BAs), conjugates of a SAM-like moiety connected to the N6-atom of adenosine. Molecular dynamics simulations based on crystal structures of METTL3-14 with BAs suggest that the Y406 side chain of METTL3 is involved in the recruitment of adenosine and release of m6A. A crystal structure representing the transition state of methyl transfer shows a direct involvement of the METTL3 side chains E481 and K513 in adenosine binding which is supported by mutational analysis. Quantum mechanics/molecular mechanics (QM/MM) free energy calculations indicate that methyl transfer occurs without prior deprotonation of adenosine-N6. Furthermore, the QM/MM calculations provide further support for the role of electrostatic contributions of E481 and K513 to catalysis. The multidisciplinary approach used here sheds light on the (co)substrate binding mechanism, catalytic step, and (co)product release catalysed by METTL3, and suggests that the latter step is rate-limiting. The atomistic information on the substrate binding and methyl transfer reaction of METTL3 can be useful for understanding the mechanisms of other RNA MTases and for the design of transition state analogues as their inhibitors.elifesciences.org, 4d ago
However, for a comprehensive understanding of the complete free-energy landscape of Adk catalysis, the key step, catalyzing the actual chemical step of P-transfer, had been missing. While ES and EP complexes such as Adk bound to ADP/Mg2+ can be structurally characterized by traditional experimental methods as they represent a minimum in the free-energy landscape, the reaction path from substrate to product involving breaking and forming covalent bonds (chemical step) and traversing the crucial transition state can only be “visualized” by quantum mechanics based molecular simulations. The power of such simulations to examine the chemical steps in enzymes, including P-transfer reactions, has been extensively documented (Jin et al., 2017; Lai & Cui, 2020b; Mokrushina et al., 2020; Pérez-Gallegos et al., 2015, 2017; Roston & Cui, 2016; Valiev et al., 2007). The focus in the literature has been on the comparison of the enzyme-catalyzed and uncatalyzed reaction with respect to the bond character at the transition state (i.e., associative versus dissociative or tight versus loose) (Admiraal & Herschlag, 1995; Hengge, 2002; Kamerlin et al., 2013; Lassila et al., 2011). Here, we uncover a central new result that provokes a modified transition-state theory. Enzymes, due to their macromolecular nature, provide a fundamentally different, advantageous way to catalyze these chemical reactions compared to the uncatalyzed reaction by employing a broad TSE. Many different molecular configurations can be accommodated in the transition-state region with comparable energies via collective motions, spanning a 1 Å range along the reaction coordinate for the transferring phosphate that travels a total distance of only 2 Å during the P-transfer. This feature resembles the now well-established conformational sampling of proteins in ground states such as ES complexes. Our findings explain why enzymes do not face an entropic bottleneck for catalysis. 
Furthermore, as enzyme active sites are asymmetric in contrast to the symmetric nature of the solvent for uncatalyzed reactions, we find that the TSE also comprises highly asymmetric conformations. Our findings help resolve the controversy about the nature of the transition state in enzyme-catalysed P-transfer reactions between theory and experiments (Lassila et al., 2011). The complex nature of the active site of enzymes, in contrast to simple solvent, results in different mechanisms in the enzyme-catalyzed reaction. We further note that previous QM/MM minimum energy path calculations for Adk using a semiempirical method had proposed a different mechanism with a stable metaphosphate intermediate, with an even lower energy for this metaphosphate intermediate than the ES- and EP-complexes (Shibanuma et al., 2020). In complex systems, such as enzymes, it is possible to observe artificial local minima when using minimum energy path searching strategies due to inadequate sampling (Mendieta-Moreno et al., 2015; Quesne et al., 2016). From our NMR and x-ray experimental data we know that such a stable metaphosphate does not exist in Adk-catalyzed reactions, highlighting the importance of experimental verification of simulations as performed here, and the use of extensive sampling with proper thermodynamic treatment. elifesciences.org, 10d ago
The integration of artificial intelligence technologies such as machine learning, computer vision, cloud computing, and natural language processing can significantly reduce emissions by optimizing energy production more efficiently and intelligently. The adoption of artificial intelligence in the solar industry is gradually replacing traditional analytical, physical, and empirical models for tasks like forecasting, data synthesis, and power flow optimization. What is Solar AI? Solar AI, or Solar Artificial Intelligence, refers to the integration of AI technologies into solar power systems and infrastructure. This integration aims to enhance the efficiency, reliability, and overall performance of solar energy solutions. The convergence of AI and solar power brings forth an array of applications, ranging from predictive maintenance and power optimization to intelligent grid management. The Solar AI market is a burgeoning sector within the broader renewable energy industry. As of the latest available data, the global Solar AI market size stands at an estimated $X billion, showcasing substantial growth in recent years. This growth is propelled by factors such as increased solar adoption, advancements in AI technology, and a growing awareness of the importance of sustainable energy solutions. Within the broader evolution of renewable energy, Solar AI technology is poised to reshape how energy is generated and distributed, applying algorithms and computational methods to optimize energy harvesting.
One hallmark of the technology is its ability to coordinate a diverse mix of energy sources, integrating solar, wind, and hydroelectric power so that the energy grid remains reliable and resilient even under adverse conditions. Ask here for a sample study@ https://www.precedenceresearch.com/sample/3397 Factors Driving the Growth of Solar AI Improved Energy Efficiency One of the primary drivers behind the growth of Solar AI is its ability to significantly improve the energy efficiency of solar systems. AI algorithms can analyze data in real time, optimizing the operation of solar panels and ensuring that energy production is maximized. Predictive Maintenance Solar AI technology enables predictive maintenance, which can prevent costly downtime and reduce operational expenses. AI-powered systems can identify potential issues in solar infrastructure before they become critical, allowing for timely repairs and maintenance. Integration with Smart Grids The integration of Solar AI with smart grids is reshaping the energy landscape.
These intelligent grids can efficiently manage energy distribution, ensuring that excess energy is stored and used optimally, reducing waste and costs. Climate Change Mitigation Solar AI plays a crucial role in combating climate change by increasing the adoption of renewable energy sources. By making solar energy more reliable and efficient, it contributes to the reduction of greenhouse gas emissions. Future Growth Prospects The future of the Solar AI market is exceedingly promising, with several factors poised to drive its growth further. Advancements in AI Technology As AI technology continues to advance, Solar AI solutions will become even more sophisticated and effective. This will attract more businesses and consumers to adopt solar energy systems. Government Initiatives and Incentives Many governments worldwide are offering incentives and subsidies to promote the use of renewable energy, including Solar AI solutions. These initiatives will fuel market growth. Rising Environmental Concerns With an increasing global focus on environmental sustainability, the demand for clean energy solutions is set to surge. Solar AI, with its eco-friendly approach, is well-positioned to meet this demand. Expanding Solar Infrastructure The continued expansion of solar infrastructure, both in residential and commercial sectors, will provide a broader market for Solar AI solutions. Get Full Report Study@ https://www.precedenceresearch.com/checkout/3397 Ask here for more details@ Call: USA - +1 9197 992 333 | sales@precedenceresearch.com...altenergymag.com, 27d ago
Electronic devices rely on manipulating the electrical and magnetic properties of components, whether for computing or storing information, among other processes. Controlling magnetism with voltage instead of electric currents has become a very important control method for improving energy efficiency in many devices, since currents heat up circuits. In recent years, much research has been carried out to implement protocols for applying voltages to carry out this control, but always through electrical connections made directly on the materials. A research team formed by members of the UAB Department of Physics and ICMAB-CSIC, with the collaboration of the Institute of Microelectronics of Barcelona (IMB-CNM-CSIC) and the ALBA synchrotron, has managed for the first time to modify the magnetic properties of a thin layer of cobalt nitride (CoN) by applying electrical voltage without the use of wires. To do this, the researchers placed the sample of magnetic material in a liquid with ionic conductivity and applied the voltage to the liquid via two platinum plates, without connecting any wires directly to the sample. This generated an induced electric field that caused the nitrogen ions to leave the CoN and caused magnetism to appear in the sample, which changed from non-magnetic to magnetic. The induced magnetic properties can be modulated as a function of the applied voltage and actuation time, as well as the arrangement of the sample, and temporary or permanent changes in magnetism can also be induced, depending on the orientation of the sample with respect to the imposed electric field. "Being able to control the magnetism of a sample wirelessly by modifying the voltage represents a paradigm shift in this area of research," explains Jordi Sort, ICREA researcher at the UAB Department of Physics.
"This is a finding that could have applications in a wide range of fields such as biomedicine, to control the magnetic properties of nanorobots without wires, or in wireless computing, to write and erase information in magnetic memories with voltage but without wiring.” The methodology presented by the researchers to achieve wireless magnetic control is not exclusive to the material used in the experiments, cobalt nitride. For ICMAB-CSIC researcher Nieves Casañ-Pastor, "these protocols can be extrapolated to other materials to control other physical properties wirelessly, such as superconductivity, memristor control, catalysis or transitions between insulator and metal, as well as wireless electrodes for neuronal electrostimulation, to cite a few examples that can expand the scope of application and technological impact of this research". The study, recently published in Nature Communications, was led by Jordi Sort, ICREA researcher at the UAB Department of Physics, and Nieves Casañ-Pastor, ICMAB-CSIC researcher, with the participation of Zheng Ma, from the UAB Department of Physics, and Laura Fuentes, from ICMAB-CSIC and the Institute of Microelectronics of Barcelona (IMB-CNM-CSIC), both first authors of the research article; Zhengwei Tan, Eva Pellicer and Enric Menéndez, from the UAB Department of Physics; Libertad Abad, from the Institute of Microelectronics of Barcelona CNM-CSIC; and Javier Herrero Martín, from the ALBA synchrotron.icmab.es, 25d ago

Latest

new Finer control over the visual attributes and concepts represented in a generated image is often required by artistic users of text-to-image diffusion models, and is presently not achievable. It can be challenging to accurately modify continuous attributes, such as a person’s age or the intensity of the weather, using simple text prompts. This constraint makes it difficult for creators to alter images to better reflect their vision. The research team from Northeastern University, the Massachusetts Institute of Technology, and an independent researcher respond to these needs in this study by presenting interpretable Concept Sliders, which enable fine-grained concept manipulation inside diffusion models. Their approach gives artists high-fidelity control over image editing and generation. The research team will provide their trained sliders and code as open source. Concept Sliders addresses several issues that other approaches fail to address adequately.MarkTechPost, 1d ago
new Researchers from the Lawrence Berkeley National Laboratory and Google DeepMind have published two papers in Nature demonstrating the potential of AI predictions for autonomous materials synthesis. The study reports the discovery of 2.2 million new crystals, equivalent to approximately 800 years’ worth of knowledge. Their new deep learning tool, Graph Networks for Materials Exploration (GNoME), predicts the stability of novel materials, greatly improving the speed and efficiency of discovery. GNoME exemplifies the promise of AI in the large-scale discovery and development of novel materials. Separate yet contemporaneous efforts by scientists in laboratories across the globe have already produced 736 of these novel structures.MarkTechPost, 1d ago
new Exploring the creative capability of artificial intelligence, a study recently published in Scientific Reports illuminates the captivating interplay between human ingenuity and computational creativity. Researchers Mika Koivisto and Simone Grassini utilized the Alternate Uses Task (AUT) as a medium to explore divergent thinking, a cognitive process wherein individuals formulate varied ideas or solutions in response to a singular task. Encompassing responses from 256 human participants and three advanced AI chatbots—ChatGPT3, ChatGPT4, and Copy.Ai—the study scrutinized creative alternatives for commonplace objects such as a pencil and a candle. Delving into semantic distance and creativity as measured through computational and human evaluators, respectively, the findings unveiled a nuanced narrative: while chatbots, on average, surpassed human scores in semantic distance and creativity, the peak of human creativity ultimately eclipsed the AI-generated responses in seven of the eight scoring categories. Thus, while AI chatbots demonstrably weave a tapestry of ideas that align with, or even surpass, average human creativity, the best level of human imaginative thought, at least presently, remains unparalleled. This investigative foray invites further exploration into the application and integration of AI within the realm of creative processes, foreshadowing an intriguing symbiosis of human and artificial creative potentials.Montreal AI Ethics Institute, 1d ago
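The "semantic distance" score used in such AUT studies is typically the cosine distance between embedding vectors of the prompt object and the proposed use; larger distances indicate more remote, and often more creative, associations. A minimal sketch with hand-made toy vectors (illustrative only; the study itself relied on trained language-model embeddings, and the vectors and names below are assumptions):

```python
from math import sqrt

def cosine_distance(u, v):
    # Semantic distance is commonly defined as 1 - cosine similarity
    # between the embedding of the prompt object and that of the response.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

# Toy 3-dimensional "embeddings" (hypothetical, not from a real model)
pencil = [0.9, 0.1, 0.0]
write  = [0.8, 0.2, 0.1]   # a common, semantically close use
boat   = [0.1, 0.2, 0.9]   # a distant, more "creative" use

print(cosine_distance(pencil, write) < cosine_distance(pencil, boat))  # True
```

Under this measure, the mundane use scores a small distance and the unusual use a large one, which is the basic mechanism by which chatbot and human responses can be compared on the same scale.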
new As Raimondo advocates for increased funding to fortify export controls on AI chips, the question that lingers is whether this financial infusion will be sufficient to counter China’s relentless pursuit of cutting-edge semiconductor technology. In an era where national security is intricately linked with technological supremacy, the decisions made today will shape the future balance of power. Can increased funding truly safeguard the nation’s technological edge, or does it merely represent a temporary barrier in an ever-evolving global landscape? As nations grapple for dominance in the AI arena, only time will reveal the efficacy of these strategic moves and their impact on the delicate balance of international relations.BitcoinEthereumNews.com, 1d ago
new ...deTour 2023 is presented by PMQ and sponsored by Create Hong Kong of the Government of the Hong Kong Special Administrative Region. This 10-day grand design festival will present various programmes for the art and design communities both in and outside Hong Kong. These programmes include creative installations and exhibitions, workshops, design dialogues, movie screenings, guided tours, and many more. 10 years of deTour at PMQ Started in 2014, PMQ's deTour celebrates its 10th anniversary this year. A unique theme has been assigned to each edition of deTour to explore how design could solve social problems and improve our lives. To commemorate this journey, a 10-year retrospective exhibition will be featured in this year's design festival to look back on the highlighted exhibits and behind-the-scenes of the past decade of PMQ's deTour. This part of the exhibition will be held during the same period as the deTour 2023 Design Festival at PMQ and admission is free; everyone is invited to witness deTour's road to success and the contribution of PMQ and deTour to pushing forward the development of the local design industry. The theme of deTour 2023 – "New Know How" – the Interweaving of "Craft", "Design" and "Technology" Working around "New Know How" as the theme, deTour 2023 aims to pave the road to a craft design future that takes off from an appreciation of traditional craftsmanship. When working, craftsmen are laser-focused, highly precise and people-oriented, and these qualities are collectively deemed the "spirit of craftsmanship", also regarded as "Know How". Curators of deTour 2023 stated that "this valuable artisanship ought to be applied to different industries. It, together with design thinking and advanced technology, will help take each industry to a new level.
By achieving this, problems can be solved more effectively, and through analysing obstacles from a new angle and perhaps trying solutions with new materials and techniques, new creations that are more thoughtful might be possible – this is what we meant by 'New Know How'." Through this design festival, the curators hope to bring the industries to re-appreciate traditional crafts from a new angle and make good use of "New Know How" in different sectors. Three Exhibition Areas with 20 International and Local Works The exhibition of the deTour 2023 Design Festival is divided into three parts: International Collaboration, Feature Exhibitions and Selected Entries, which are curated under the five specific directions of New Value, Storytelling, Craft Enhancement, Innovation and Heritage. Scattered all over the PMQ premises, 20 design works from the Netherlands, Japan and local designers are open to the public for free appreciation. In addition, over the 10 days of the design festival, there will be a fruitful programme of a total of 40 workshops, 12 design dialogues and screening sessions of the documentary "Food and Design". International Collaboration: Craft by Nature by Biobased Creations (The Netherlands) deTour 2023's International Collaboration has invited Biobased Creations from the Netherlands, a creative studio formed by a group of designers, researchers and artists who aim to encourage the use of biomaterials - also known as renewable eco resources - in daily life, so as to evolve our world into a regenerative and circular one. Biobased Creations presents Craft by Nature, an exhibit specially designed for deTour 2023. Shaped like a house, the exhibit allows visitors to find multiple designs made from biobased materials such as furniture, clothes, shoes and socks, and household items. All of these designs are eco-friendly, light in weight and with a minimalistic design, created by designers from Hong Kong and the Netherlands.
Outside the house in the garden, visitors can see the "originals" of the materials used in the designs, including common reed, silvergrass and shells, materials we are familiar with. In addition, Biobased Creations will also display 40 biomaterials that can be applied in architecture and heat insulation, and visitors are welcome to hold and feel these materials and scan the QR code to instantly learn how these biomaterials are made. Craft by Nature is an innovative installation demonstrating how materials that are ignored or abandoned can be transformed into chic designs. It allows us to rethink our relationship with nature and invites us to take one more step forward to make our world better. Feature Exhibitions: Exploring traditional crafts with a fresh eye...SME Business Daily Media, 1d ago
new The eastern bluebird is a special bird. The blue of its feathers is unique; it is based not on pigments but on the special structure of the feather. Viewed under the microscope, the feathers are traversed by a network of channels with a diameter of just a few hundred nanometres (for scale, a nanometre is a billionth of a metre). The blue of the bluebird caught the attention of ETH researchers from the Laboratory of Soft and Living Materials, led by former ETH Professor Eric Dufresne, so much so that they decided to replicate this material in the laboratory. They have now succeeded with a new method: they have developed a material that exhibits the same structural design as the bluebird's feathers, while additionally offering potential for practical applications thanks to its nanonetworks.ScienceDaily, 1d ago

Top

Newswise — ROLLA, Mo. – Missouri University of Science and Technology has long been home to one of the nation’s most diverse energy-focused research portfolios, and leaders are now taking steps to accelerate energy innovation from S&T’s laboratories to the marketplace. “We are currently putting together an energy technology incubator at Missouri S&T,” says Dr. David Borrok, vice provost and dean of S&T’s College of Engineering and Computing. “The energy research we conduct at S&T is incredible and covers many applications. We are excited to provide support for our research teams to advance their research to the commercial marketplace to help solve some of the world’s most challenging issues.” The Missouri S&T Energy Technology Incubator (ETI) will provide research teams with seed grants and other resources to help them more quickly get patents and generate viable business products. Borrok says researchers will be able to partner with S&T’s Kummer College of Innovation, Entrepreneurship, and Economic Development and work with the office for Technology Transfer and Economic Development, while also having some ETI funding available to jumpstart their energy-focused research. The first seed grants that will support the ETI are sponsored by Molly and Andrew Laegeler, alumni of S&T who previously funded a postdoctoral fellowship focused on sustainable energy. The new seed grants will replace that fellowship, with the goal of helping faculty members take ideas from concept to reality. “The goal is to have several of these seed grants and other resources available,” Borrok says. “We are grateful for the support the Laegelers have shown S&T, and we hope even more individuals will be inspired to contribute.” Borrok says donors could select a specific energy research focus or elect to support the university’s energy research more broadly. 
Some of S&T’s energy research areas include:
Energy storage: Researchers are developing energy storage technology by working with new materials, advanced electrode engineering and multiscale modeling to better understand and mitigate battery degradation and aging. Researchers are also studying extreme-fast charging capabilities for electric vehicles and recently demonstrated their efforts to the U.S. Department of Energy.
Electric energy: The university has ongoing research efforts related to power electronics, transportation electrification, microgrids and renewable energy systems. This research focuses on improving the reliability, efficiency, economics and security of power system operation and planning. Missouri S&T is also home to the Solar Village living laboratories, which have multiple microgrids and student-designed solar houses used for research and demonstrations.
Hydrogen generation, storage and use: The university’s hydrogen research covers areas such as green hydrogen production through electrolysis, nanostructured catalysts, hydrogen storage materials, hydrogen fueling stations, safety protocols, sensors and fuel cells. Researchers are studying how hydrogen can best be used to achieve the federal government’s goal of net-zero carbon emissions by 2050.
Carbon management: Faculty and students are conducting research in carbon capture, utilization and storage (CCUS) to reduce harmful emissions and wasted energy. Their efforts encompass carbon capture directly from air or point sources, the mineralization of CO2 to produce carbon-negative concrete or cement supplements, reusing carbon in enhanced oil recovery and producing commodity chemicals and fuels from waste CO2. Research is also being conducted at S&T focused on the decarbonization of the steel and cement industries.
Geothermal: Recognizing the untapped potential of geothermal energy as a sustainable and renewable resource, S&T’s research teams are working to improve heat recovery efficiency, improve fracture networks through new fracking methods, and control fluid flow loss during drilling with innovative materials.
Mining: Researchers are developing innovative and sustainable approaches for sourcing critical minerals that support the global energy transition. This research aims to cut down on the environmental impact of mining and processing these minerals by using renewable energy and exploring methods that remove carbon from the process. S&T is leading the charge on developing methods for extracting resources from existing base metal extractions, tailings and other unconventional sources.
Nuclear energy: Missouri S&T has been home to a nuclear reactor used for research and training since 1961. The university’s researchers study nuclear materials, environmental impact assessments of energy, siting of power plants and spent nuclear fuel, and applying machine learning and digital twins for power plant lifecycles. Research efforts also extend to radioisotopes and thermal hydraulics of cooling systems, investigating irradiation effects in insulators and electronics, developing structural alloys for reactor vessels, and studying irradiation effects on ceramic fuels and moderators.
How to help: To support this initiative, visit give.mst.edu, select “other” and type in “ETI.” Contact Lara Turek, executive director of development, at [email protected] or 314-971-1101 with any questions.
About Missouri University of Science and Technology: Missouri University of Science and Technology (Missouri S&T) is a STEM-focused research university of over 7,000 students located in Rolla, Missouri.
Part of the four-campus University of Missouri System, Missouri S&T offers over 100 degrees in 40 areas of study and is among the nation’s top public universities for salary impact, according to the Wall Street Journal. For more information about Missouri S&T, visit www.mst.edu.Release No.: 105-GE/AS...newswise.com, 16d ago
One of the great mysteries of modern physics is how to reconcile quantum mechanics with the general theory of relativity. The prevailing assumption is that the gravitational field should somehow be quantised, though a number of alternative approaches exist and the fundamental nature of gravity still remains an open question. In the last few years, however, a potential route to resolving this issue has emerged from the field of quantum information. The idea is that certain types of correlations – for example, entanglement – cannot be created between two distinct subsystems if only local (quantum) operations and classical communication (LOCC) are allowed. This suggests that detecting gravity mediated entanglement between two macroscopic scale masses would indicate that the interaction is either quantised or that gravity acts non-locally in the macroscopic limit.CoinGenius, 26d ago
..."We now have, for example, completely new ways of testing the equivalence principle of Einstein, one of the most fundamental assumptions of fundamental physics," says Naceur Gaaloul from the Institute of Quantum Optics at LUH and co-author of the new study. The famous principle holds that gravity affects all objects the same regardless of their mass. It is a principle that many physics teachers will demonstrate by putting a feather and a hammer in a sealed vacuum chamber and showing that, in the absence of air friction, the two fall at the same rate.phys.org, 18d ago

Latest

new There are many scientists and philosophers who share this conviction that the fine-tuning of physics can’t be just a fluke, but who think there is an alternative explanation: the multiverse hypothesis. If there is a huge, perhaps infinite, number of universes, each with different numbers in their physics, then it’s not so improbable that one of them would happen to have the right numbers by chance. And we surely don’t need an explanation of why we happened to be in the fine-tuned universe; after all, we couldn’t have existed in a universe that wasn’t fine-tuned. The latter part of the explanation is known as the ‘anthropic principle’.Aeon, 1d ago
new Their paper, “The internal structure of Eris inferred from its spin and orbit evolution,” recently appeared in the journal Science Advances. The research began while Nimmo was visiting Professor Brown at Caltech and realized that some of his previously-unpublished data could help reveal information about the properties of Eris. At present, we know that Eris is about the same size and mass as Pluto and has a highly eccentric orbit around our Sun, ranging from 38.271 AU at perihelion to 97.457 AU at aphelion. This is almost twice as eccentric as Pluto’s orbit and roughly 50% farther from the Sun.Universe Today, 1d ago
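The quoted perihelion and aphelion distances fix the orbit's eccentricity, since e = (r_ap - r_per) / (r_ap + r_per). A quick check of the "almost twice as eccentric as Pluto" claim (Pluto's eccentricity, about 0.249, is supplied here from standard orbital data, not from the article):

```python
r_per, r_ap = 38.271, 97.457   # Eris: perihelion and aphelion distances, in AU

# Semi-major axis and eccentricity follow from the two apsidal distances
a = (r_per + r_ap) / 2
e_eris = (r_ap - r_per) / (r_ap + r_per)

e_pluto = 0.249                # standard value, assumed here for comparison
print(round(a, 2), round(e_eris, 3), round(e_eris / e_pluto, 2))
# → 67.86 0.436 1.75
```

The ratio of about 1.75 is consistent with the article's "almost twice as eccentric" phrasing.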
new Simulations show how urban environments pose a unique set of challenges for UAVs due to the turbulent flows generated by infrastructure. Current designs are susceptible to these gusts in specific scenarios and may not have the response times necessary to overcome them. While there are exciting possibilities for the development of these technologies, further work must be done to ensure the stability of the UAVs and the safety of deliveries and future passengers. The vision of Jetson-like flying cars dotting the skies is still found most often in animated films or Star Wars, but that could change rapidly, so newer buildings and infrastructure in urban areas should be designed to support the air taxis and Amazon delivery drones now being discussed.Connected World - IoT and Digital Transformation, 2d ago
new Generative AI is a ground-breaking technology, and we are excited to integrate it into Oracle NetSuite soon. The rapid progress in generative AI illustrates that other AI technologies are likely to see significant advancements as well. The future lies in a combination of various AI technologies and business applications. Large Language Models (LLMs) excel in understanding language and data, while traditional AI technologies are proficient in handling numerical data. The goal is to make AI seamless within the user experience. It should assist, advise, and adapt to users’ needs without requiring them to understand the underlying technology. In Oracle NetSuite, we’ve integrated generative AI seamlessly into everyday workflows. Our unique approach leverages the wealth of data within Oracle NetSuite to tailor the AI to the specific context and user requirements. Users can configure AI to match their company’s preferences, aligning with our philosophy of enabling easy configurability for all users.DATAQUEST, 2d ago
new As you know, that view can be rejected as a kind of metaphysical, necessitarian prejudice. The claim is that, in fact, when we look at what science tells us about quantum phenomena and the indeterministic order of things, especially now, in the 21st century, we shouldn’t endorse that picture. What we need is a probabilistic conception of causation rather than a necessitarian one. It’s exactly that wedge that Kane uses as the metaphysical or ontological foundation of this alternative picture. He also uses computing analogies, and suggests that you can have parallel processing systems where the outcome isn’t always the same. The system can be in a seemingly identical state, but different outcomes will issue from it. Same inputs, different outputs.Five Books, 2d ago
new Liquid-crystal (LC) phase modulators are widely used in optical systems because of their advantages of low power consumption, light weight, flexible bandwidth adjustment, and non-mechanical movement. However, most LC phase modulators are polarization-sensitive, meaning that they affect the phase of light differently depending on its polarization. This can limit their performance and functionality in some applications. There are two main approaches to realizing polarization-independent LC phase modulators. The first approach is to use polarization-independent LC materials, such as polymer-stabilized blue-phase liquid crystals (PS-BPLCs). However, PS-BPLCs require high driving voltages, which can make them impractical for some applications. The second approach is to change the alignment of the LC directors. One way to do this is to use a double-layer LC cell, which consists of two LC cells stacked on top of each other with their LC directors oriented orthogonally. This allows light to be decomposed into two orthogonal components, each of which experiences the same phase modulation. However, double-layer LC cells are complex and difficult to manufacture. Another way to achieve polarization-independent LC phase modulation is to use orthogonal photoalignment. This involves using a special photoalignment layer that creates orthogonal alignment domains in the LC. However, it is difficult to achieve precise alignment using this method. In a new paper published in Light: Advanced Manufacturing, a team of scientists led by Professor Jiangang Lu has developed a new approach to polarization-independent LC phase modulation, based on a light-controlled azimuth angle (LCAA) process.
The LCAA process uses the optical rotatory effect of cholesteric liquid crystals (CLC) to create single-layer, multi-microdomain, orthogonally twisted (MMOT) structures. MMOT structures are composed of multiple microdomains with orthogonally aligned LC directors. The LCAA process uses a patterned light beam to control the alignment of the LC directors in each microdomain, allowing the researchers to create MMOT structures with precise alignment. LC phase modulators with a single-layer MMOT structure have the potential to be both polarization-independent and to have a large phase depth. This makes them ideal for a wide range of applications, including optical communications, wearable devices, and displays. A light-controlled azimuth angle (LCAA) process can be used to fabricate a multi-microdomain orthogonally twisted (MMOT) device with low polarization dependence, high phase retardation, and a simple structure. The alignment angle between the top and bottom substrates in the LCAA process and the mask grid size of the MMOT structure can be tailored to meet the requirements of different applications. This device has the potential to revolutionize the way we use light in a variety of applications. For example, it could be used to create new types of optical communications systems that are more efficient and reliable. It could also be used to develop new types of wearable devices that can display information in a clearer and more concise way.newswise.com, 2d ago
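The double-layer approach described above (two stacked cells with orthogonal LC directors, so each polarization component accumulates the same phase) can be illustrated with 2x2 Jones matrices. A minimal sketch with idealized retarders; the matrix forms and the retardation value are illustrative assumptions, not taken from the paper:

```python
import cmath

def matmul2(A, B):
    # Product of two 2x2 complex matrices, represented as ((a, b), (c, d))
    return (
        (A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]),
        (A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]),
    )

def lc_cell(gamma, axis_horizontal=True):
    # Idealized Jones matrix of an LC cell with phase retardation gamma:
    # only the component along the extraordinary axis picks up e^{i*gamma}.
    p = cmath.exp(1j * gamma)
    return ((p, 0), (0, 1)) if axis_horizontal else ((1, 0), (0, p))

gamma = 1.2  # retardation of each cell, in radians (arbitrary example value)
stack = matmul2(lc_cell(gamma, True), lc_cell(gamma, False))

# With orthogonal directors, the stack reduces to e^{i*gamma} times the
# identity: both polarization components carry the same phase.
print(abs(stack[0][0] - stack[1][1]) < 1e-12 and stack[0][1] == 0)  # True
```

The same cancellation is what a single-layer MMOT structure aims to achieve within one cell, by averaging over microdomains with orthogonal directors instead of stacking two cells.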

Top

We investigate and characterize the emergence of finite-component dissipative phase transitions (DPTs) in nonlinear photon resonators subject to $n$-photon driving and dissipation. Exploiting a semiclassical approach, we derive general results on the occurrence of second-order DPTs in this class of systems. We show that for all odd $n$, no second-order DPT can occur while, for even $n$, the competition between higher-order nonlinearities determines the nature of the criticality and allows for second-order DPTs to emerge only for $n=2$ and $n=4$. As pivotal examples, we study the full quantum dynamics of three- and four-photon driven-dissipative Kerr resonators, confirming the prediction of the semiclassical analysis on the nature of the transitions. The stability of the vacuum and the typical timescales needed to access the different phases are also discussed. We also show a first-order DPT where multiple solutions emerge around zero, low, and high-photon numbers. Our results highlight the crucial role played by strong and weak symmetries in triggering critical behaviors, providing a Liouvillian framework to study the effects of high-order nonlinear processes in driven-dissipative systems, that can be applied to problems in quantum sensing and information processing.CoinGenius, 27d ago
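A generic Lindblad model for this class of systems, written here in the standard form for an $n$-photon driven Kerr resonator (the paper's exact Hamiltonian may contain additional higher-order nonlinear terms), is:

```latex
\frac{d\hat{\rho}}{dt} = -i\big[\hat{H},\hat{\rho}\big]
  + \frac{\kappa_n}{2}\Big(2\,\hat{a}^{n}\hat{\rho}\,\hat{a}^{\dagger n}
  - \hat{a}^{\dagger n}\hat{a}^{n}\hat{\rho}
  - \hat{\rho}\,\hat{a}^{\dagger n}\hat{a}^{n}\Big),
\qquad
\hat{H} = \Delta\,\hat{a}^{\dagger}\hat{a}
  + \frac{K}{2}\,\hat{a}^{\dagger 2}\hat{a}^{2}
  + \epsilon\big(\hat{a}^{\dagger n} + \hat{a}^{n}\big),
```

where $\Delta$ is the drive-cavity detuning, $K$ the Kerr nonlinearity, $\epsilon$ the $n$-photon drive amplitude, and $\kappa_n$ the $n$-photon dissipation rate.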
Security of a storage device against a tampering adversary has been a well-studied topic in classical cryptography. Such models give black-box access to an adversary, and the aim is to protect the stored message or abort the protocol if there is any tampering. In this work, we extend the scope of the theory of tamper detection codes against an adversary with quantum capabilities. We consider encoding and decoding schemes that are used to encode a $k$-qubit quantum message $\vert m\rangle$ to obtain an $n$-qubit quantum codeword $\vert \psi_m \rangle$. A quantum codeword $\vert \psi_m \rangle$ can be adversarially tampered via a unitary $U$ from some known tampering unitary family $\mathcal{U}_{\mathsf{Adv}}$ (acting on $\mathbb{C}^{2^n}$). Firstly, we initiate the general study of \textit{quantum tamper detection codes}, which detect if there is any tampering caused by the action of a unitary operator. In case there was no tampering, we would like to output the original message. We show that quantum tamper detection codes exist for any family of unitary operators $\mathcal{U}_{\mathsf{Adv}}$ such that $\vert\mathcal{U}_{\mathsf{Adv}}\vert < 2^{2^{\alpha n}}$ for some constant $\alpha \in (0,1/6)$, provided that the unitary operators are not too close to the identity operator. The quantum tamper detection codes that we construct can be considered to be quantum variants of \textit{classical tamper detection codes} studied by Jafargholi and Wichs [’15], which are also known to exist under similar restrictions. Additionally, we show that when the message set $\mathcal{M}$ is classical, such a construction can be realized as a \textit{non-malleable code} against any $\mathcal{U}_{\mathsf{Adv}}$ of size up to $2^{2^{\alpha n}}$.CoinGenius, 26d ago
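For intuition about the classical objects being generalized, here is a toy algebraic-manipulation-detection (AMD) style code over a small prime field, in the spirit of classical tamper detection against a fixed (here additive) tampering family. This is not the quantum scheme of the paper; the field size, message, and offsets are arbitrary choices for illustration:

```python
P = 101  # small prime field GF(P), kept tiny so we can check exhaustively

def encode(s, r):
    # AMD-style codeword (s, r, f(s, r)) with f(s, r) = r^3 + s*r (mod P).
    return (s % P, r % P, (pow(r, 3, P) + s * r) % P)

def is_valid(word):
    s, r, t = word
    return (pow(r, 3, P) + s * r) % P == t

def tamper(word, delta):
    # Additive tampering: the adversary shifts each component by a fixed
    # offset without seeing the codeword (black-box model).
    return tuple((w + d) % P for w, d in zip(word, delta))

s, delta = 42, (7, 13, 5)  # hypothetical message and tampering offset
misses = sum(
    1 for r in range(P)
    if is_valid(tamper(encode(s, r), delta))
)
# For this nonzero offset, the validity condition becomes a nonzero
# quadratic in r, so at most 2 of the P random choices of r go undetected.
assert misses <= 2
```

The quantum construction in the abstract must instead handle an exponentially large family of tampering unitaries, which is where the doubly exponential bound $2^{2^{\alpha n}}$ enters.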
Such approaches, anchored in the politics of victimhood and power disparity, disadvantage the organised state. They grapple with the task of countering narratives that are emotionally potent and lean on entrenched stereotypes. However, where emotions run high and narratives define battles, time is of the essence. In the connected era, where information spreads with unparalleled velocity, biases serve as powerful anchors. Societies are fundamentally mega factories of information processing. As the quantum and complexity of information grow, they tend to incline towards familiar narratives to make meaning of their worlds. The magnification of such reduced narratives distorts reality, allowing misinformation and bias to shape opinion. For instance, almost immediately after the bombing of the Al-Ahli hospital, Hamas reported the news in all its gory detail through social media and Gaza-based news organisations while pinning the blame on Israel. By the time Israel presented evidence that implicated a rogue Islamic Jihad rocket, the story was already widespread, and the streets were afire. The Israeli position was dead on arrival, despite later expressions of regret by several news organisations for carrying unverified news. Even more bewilderingly, it took Israel close to three weeks to release a video account of the events of 7 October, in a belated attempt to stem the narrative war which was going against it.ORF, 28d ago
new Among the lowest-hanging fruit for AI in government are press releases and other forms of communication from agencies to residents. Zencity, for example, debuted a ChatGPT tool that writes what amounts to a first draft of a press release — including potential quotes from public officials. That could save significant time for city managers, among other advantages, according to the company. Municipal budgeting, too, could serve as fertile ground for generative AI. A new AI tool from ClearGov takes in past budgeting data and future estimates to produce what officials sometimes call a budget narrative. Such narratives, which put spending figures into context, often help those officials sell the budget to peers and voters. AI could bring more efficiency to the process, usually the most difficult and contentious work undertaken by local and state governments. Generative AI also gained more of a presence in higher education in 2023. The technology can help with essays, math problems and lesson plans, with work completed within seconds. But fears of plagiarism and other abuses have led to a more cautious welcome for AI in universities and colleges than in city halls, with large school districts such as the one in New York City initially placing restrictions on ChatGPT. College deans and local school boards continue to grapple with the full implications of AI. So do other governmental bodies, as the technology hogged more of the spotlight as 2023 progressed. Maine imposed a six-month ban on the use of AI for state employees using state devices or conducting state business. Officials said they needed time to study the privacy, bias and misinformation concerns sparked by ongoing deployment of AI-based products. Executive orders started to emerge at a regular clip as fall rolled around, with the Pennsylvania, Virginia, Oklahoma and New Jersey governors all issuing guidance within a few weeks of each other. 
Their missives were followed within weeks by an AI Executive Order from the White House in late October. Each official action recognized both the potential and the risk, with many calling for outside help to develop appropriate policies for safe use in service of their residents. It’s almost impossible, however, to imagine a gov tech future without much more artificial intelligence. Evidence for that comes from every corner of the industry. For instance, industry giant Tyler Technologies touted its growing ability to use AI for quicker and more accurate court filings, whose complex coding and redaction requirements often force judicial employees to perform manual data entry. Klir’s new AI-backed offering is designed to improve water management and compliance, with what the company calls “holistic” views of utility systems delivered via a chatbot fueled by artificial intelligence. Startups, of course, have also embraced AI, as shown by the most recent cohort from CivStart’s gov tech accelerator program, which provides at least some foreshadowing of the tools public agencies might be using a few years from now. One of the program participants is using artificial intelligence to help local officials — many of them new to the grunt work of government — write and manage legislation.GovTech, 2d ago
This marks another major achievement by Professor Huang's team in its 'Super Steel' Project, following the development of the anti-COVID-19 stainless steel in 2021, and ultra-strong and ultra-tough Super Steel in 2017 and 2020 respectively. The new steel developed by the team exhibits high corrosion resistance, enabling its potential application for green hydrogen production from seawater, where a novel sustainable solution is still in the pipeline. The performance of the new steel in a salt water electrolyser is comparable to the current industrial practice of using titanium as structural parts to produce hydrogen from desalted seawater or acid, while the cost of the new steel is much lower. The discovery has been published in Materials Today in the paper titled "A sequential dual-passivation strategy for designing stainless steel used above water oxidation." Patent applications for the research achievements are under way in multiple countries, and two have already been granted. Since its discovery a century ago, stainless steel has always been an important material widely used in corrosive environments. Chromium is an essential element in establishing the corrosion resistance of stainless steel. A passive film is generated through the oxidation of chromium (Cr) and protects stainless steel in natural environments. Unfortunately, this conventional single-passivation mechanism based on Cr has halted further advancement of stainless steel. Owing to the further oxidation of stable Cr2O3 into soluble Cr(VI) species, transpassive corrosion inevitably occurs in conventional stainless steel at ~1000 mV (saturated calomel electrode, SCE), which is below the potential required for water oxidation at ~1600 mV. 254SMO super stainless steel, for instance, is a benchmark among Cr-based anti-corrosion alloys and has superior pitting resistance in seawater; however, transpassive corrosion limits its application at higher potentials. 
By using a "sequential dual-passivation" strategy, Professor Huang's research team developed the novel SS-H2 with superior corrosion resistance. In addition to the single Cr2O3-based passive layer, a secondary Mn-based layer forms on the preceding Cr-based layer at ~720 mV. The sequential dual-passivation mechanism protects the SS-H2 from corrosion in chloride media up to an ultra-high potential of 1700 mV. The SS-H2 demonstrates a fundamental breakthrough over conventional stainless steel. "Initially, we did not believe it, because the prevailing view is that Mn impairs the corrosion resistance of stainless steel. Mn-based passivation is a counter-intuitive discovery, which cannot be explained by current knowledge in corrosion science. However, when numerous atomic-level results were presented, we were convinced. Beyond being surprised, we cannot wait to exploit the mechanism," said Dr Kaiping Yu, the first author of the article, whose PhD is supervised by Professor Huang. From the initial discovery of the innovative stainless steel to achieving a breakthrough in scientific understanding, and ultimately preparing for official publication and, hopefully, industrial application, the team devoted nearly six years to the work. "Unlike the broader corrosion community, which mainly focuses on resistance at natural potentials, we specialise in developing high-potential-resistant alloys. Our strategy overcame the fundamental limitation of conventional stainless steel and established a paradigm for alloy development applicable at high potentials. This breakthrough is exciting and brings new applications," Professor Huang said. At present, for water electrolysers in desalted seawater or acid solutions, expensive Au- or Pt-coated Ti is required for structural components. 
For instance, the total cost of a 10-megawatt PEM electrolysis tank system at its current stage is approximately HK$17.8 million, with the structural components contributing up to 53% of the overall expense. The breakthrough made by Professor Huang's team makes it possible to replace these expensive structural components with more economical steel. The employment of SS-H2 is estimated to cut the cost of structural material by about 40 times, demonstrating great promise for industrial application. "From experimental materials to real products, such as meshes and foams for water electrolysers, there are still challenging tasks at hand. Currently, we have made a big step toward industrialisation. Tons of SS-H2-based wire have been produced in collaboration with a factory from the Mainland. We are moving forward in applying the more economical SS-H2 in hydrogen production from renewable sources," added Professor Huang. Link to the paper: https://www.sciencedirect.com/science/article/abs/pii/S1369702123002390 Please click here for a short video showing how the new stainless steel produces hydrogen in salt water. Hashtag: #HKU...SME Business Daily Media, 17d ago
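A back-of-envelope check of the figures quoted above (assuming, for illustration, that the roughly 40x reduction applies to the full structural-component share, which the article does not state precisely):

```python
# Reported figures: a 10 MW PEM electrolysis system costs ~HK$17.8 million,
# with structural components contributing up to 53% of the expense.
system_cost_hkd = 17_800_000
structural_share = 0.53

structural_cost = system_cost_hkd * structural_share  # HK$9,434,000
# SS-H2 is said to cut structural material cost by about 40 times.
structural_with_ss_h2 = structural_cost / 40          # HK$235,850
savings_fraction = (structural_cost - structural_with_ss_h2) / system_cost_hkd
```

Under that assumption the saving comes to roughly half of the total system cost, which is consistent with the article's framing of a major cost reduction.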
Newswise — ROLLA, Mo. – Imagine owning a library with every book imaginable — millions and millions of titles — but not having a way to organize the different texts or search for specific information. This is essentially the scenario Missouri University of Science and Technology researcher Dr. Satish Puri works to fix, but instead of it being a brick-and-mortar library filled with books, he is working with petabytes of digital data — primarily geospatial information, such as maps — and finding the best ways to run queries and get useful results as quickly as possible. To put the term “petabyte” into perspective, Puri, an associate professor of computer science, says one petabyte equals 1 million gigabytes. Each gigabyte is equivalent to 1 million kilobytes (KB). The first commercially available floppy disks, which can be considered precursors to memory cards or USB flash drives, could store 80 KB. Puri says different types of maps often need to be combined to answer queries, and the map data he works with includes a collection of spatial shapes and location markers. “The amount of spatially referenced data now available may be difficult for some people to fully conceptualize, especially when you consider the available sources, such as smartphones, drones and remote sensing satellites,” he says. But with these massive datasets comes the need to be able to effectively use them, which is why Puri was awarded a grant from the National Science Foundation’s Faculty Early Career Development (CAREER) Program last year. 
His project, which is expected to receive over $500,000 over the course of five years, is titled: “Communication-efficient and topology-aware designs for geospatial analytics on heterogeneous platforms.” Earlier this year, he was also awarded a $230,000 three-year NSF grant for a project titled “Approximate nearest neighbor similarity search for large polygonal and trajectory datasets.” Both projects focus on geospatial analytics. For these projects, Puri is creating algorithms and using high-performance computing with graphics processing units and smart network interface cards that have advanced processing capabilities and allow searches to be conducted faster. “Just think about some of the basic searches you may do while traveling,” he says. “Your phone can map out driving directions and show the top restaurants in your area, as well as the nearby cities, lakes, roads and different camera views, among other datasets. These basic examples fit with my research, but my projects are even more complex.” Puri says he is developing algorithms that can use data for everything from geophysical trends and solar physics all the way to social issues that can be documented and analyzed in maps. For example, climate scientists could potentially use his developments to more effectively monitor the melting of polar sea ice. During the COVID-19 pandemic, his algorithms could have provided more efficient ways to tie in multiple maps and determine disease hotspots related to the virus and its spread. He says his work will eventually be incorporated into publicly available software for mapping and analytics, and it will most often be used by members of the scientific community and federal agencies. “Working with big data can lead to some big challenges,” he says. 
“But my research should eventually make a big difference.” For more information about Missouri S&T’s computer science department, visit cs.mst.edu. About Missouri University of Science and Technology: Missouri University of Science and Technology (Missouri S&T) is a STEM-focused research university of over 7,000 students located in Rolla, Missouri. Part of the four-campus University of Missouri System, Missouri S&T offers over 100 degrees in 40 areas of study and is among the nation’s top public universities for salary impact, according to the Wall Street Journal. For more information about Missouri S&T, visit www.mst.edu.newswise.com, 4d ago
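The flavor of query Puri's algorithms accelerate — combining two map layers — can be sketched as a bounding-box spatial join over a uniform grid index, so each shape is only compared against nearby shapes instead of the whole other layer. A toy stand-in for the GPU- and SmartNIC-accelerated designs in his projects (all coordinates and layer names are made up):

```python
from collections import defaultdict

CELL = 10.0  # hypothetical grid cell size, in map units

def cells(box):
    """Yield the grid cells overlapped by an (xmin, ymin, xmax, ymax) box."""
    xmin, ymin, xmax, ymax = box
    for cx in range(int(xmin // CELL), int(xmax // CELL) + 1):
        for cy in range(int(ymin // CELL), int(ymax // CELL) + 1):
            yield (cx, cy)

def boxes_intersect(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def spatial_join(layer_a, layer_b):
    """Return index pairs (i, j) whose bounding boxes intersect."""
    index = defaultdict(list)
    for i, box in enumerate(layer_a):
        for cell in cells(box):
            index[cell].append(i)
    pairs = set()  # a pair may be found via several shared cells
    for j, box in enumerate(layer_b):
        for cell in cells(box):
            for i in index[cell]:
                if boxes_intersect(layer_a[i], box):
                    pairs.add((i, j))
    return sorted(pairs)

roads = [(0, 0, 5, 5), (20, 20, 25, 25)]
lakes = [(3, 3, 8, 8), (100, 100, 110, 110)]
result = spatial_join(roads, lakes)  # → [(0, 0)]
```

Production systems refine the same idea with hierarchical indexes (R-trees), exact geometry tests after the box filter, and parallel hardware.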


new Dr. Durrance has an interdisciplinary research and education program in long-term human space exploration. The hazards associated with long-term exposure to the space environment, such as radiation damage and the loss of bone mass, are not sufficiently understood to determine whether they pose acceptable risks or not. Research focused on these hazards is critical to sustained human presence outside the protective environment of the Earth’s atmosphere and magnetosphere. This program uses the Space Life Sciences Laboratory (SLS Lab) at Kennedy Space Center (KSC) as well as labs at Florida Tech. Lunar dust physics: Enabling technologies must be developed, including systems to mitigate the damaging effects of dust contamination; technologies that use local planetary resources to produce consumables such as oxygen, water and rocket propellant; food production systems; innovative range technologies and many more. We are currently studying the induction charging characteristics of lunar dust. Bone loss: The objectives of the bone project are (1) to provide a dynamic model of the structure and function of bone in response to loading, with sufficient precision to predict the effect of any arbitrary loading history, and (2) to develop and refine new countermeasures against bone loss. Radiation damage: Solar and galactic radiation is a major hazard to space crews during long-duration flights and at planetary bases beyond the Earth’s magnetic field. Intense solar flares can induce acute radiation sickness, galactic cosmic rays can kill brain cells that the body cannot replace, and all forms of radiation can induce cancer. The only known safety measure is shielding to prevent the high-speed particles from reaching the crew. Dr. Durrance also has an interdisciplinary research and education program in astrobiology addressing three fundamental questions: How does life begin and evolve? Does life exist elsewhere in the Universe? What is the future of life on Earth and beyond? 
Research addressing these questions is highly interdisciplinary, involving fields such as physics, biology, chemistry, geology, and planetary science. Extrasolar planets: Most of the newly discovered planets have been found using indirect techniques, where the planet’s effect on light emitted from the parent star is detected, not the light emitted by the planet itself. The most likely technique for detecting life on these planets is a detailed analysis of their spectra; therefore, we are developing a system that may be able to detect IR emissions from exoplanets using large, ground-based telescopes along with Very Long Baseline Interferometry (VLBI) techniques developed for radio astronomy. Amyloid fiber formation: We are pursuing research that may shed light on a fundamental question regarding the origin of life: how did the transition from non-living to living matter occur? We are investigating the spontaneous formation of long linear fibers from a weak solution of proteins using dielectric spectroscopy. This phenomenon may be important in neurodegenerative diseases, and it may help to explain the emergence of ordered biological structures that are far from thermodynamic equilibrium. Dr. Durrance is Director of the Sub-Orbital Research and Training Center, which utilizes flights of high-performance F-104 jets from the Shuttle Landing Facility (SLF) at KSC. In addition to testing prototype spaceflight hardware for NASA and commercial companies, we are developing an imaging system designed specifically for imaging Earth’s coastal regions. The coastal zone is not only the most significant ocean area in terms of productivity, economics, recreation, and natural resources but it is also the most difficult to image.fit.edu, 2d ago
new Brains can imitate, but do so in a fundamentally different way from LLM pretraining. Specifically, after self-supervised pretraining, an LLM outputs exactly the thing that it expects to see. (After RLHF, that is no longer strictly true, but RLHF is just a fine-tuning step, most of the behavioral inclinations are coming from pretraining IMO.) That just doesn’t make sense in a human. When I take actions, I am sending motor commands to my own arms and my own mouth etc. Whereas when I observe another human and do self-supervised learning, my brain is internally computing predictions of upcoming sounds and images etc. These are different, and there isn’t any straightforward way to translate between them. (Cf. here where Owain Evans & Jacob Steinhardt show a picture of a movie frame and ask “what actions are being performed?”) Now, as it happens, humans do often imitate other humans. But other times they don’t. Anyway, insofar as humans-imitating-other-humans happens, it has to happen via a very different and much less direct algorithmic mechanism than how it happens in LLMs. Specifically, humans imitate other humans because they want to, i.e. because of the history of past reinforcement, directly or indirectly. Whereas a pretrained LLM will imitate human text with no RL or “wanting to imitate” at all, that’s just mechanically what it does.alignmentforum.org, 2d ago
new My research interests are varied and I tend not to espouse a specific research agenda because of the diverse nature of our doctoral programs relative to the size of our college. Examples of doctoral research studies I have guided include the following: MATHEMATICS EDUCATION: (1) Examining the effect on student achievement and attitudes of graphing utilities in college algebra courses taught at two-year colleges; (2) Exploring the influence instructional use of calculators in elementary grades has on student performance on the mathematics component of the Florida Comprehensive Assessment Test (FCAT); (3) Applying a conceptual change model, commonly used to assess students' misconceptions of key science concepts, to community college mathematics teachers' misconceptions of how students learn mathematics; (4) Examining the spacing effect theory (i.e., how variations in the frequency and timing of instruction affect student learning) with respect to 3-, 2-, and 1-day per week schedules in college algebra. SCIENCE EDUCATION: (1) Examining the effect of biology-based virtual and physical field trips relative to students' science achievement and attitudes; (2) Examining the effect of conceptually-based instructional strategies on science achievement and attitudes of community college students in first-semester general biology, microbiology, and human anatomy and physiology courses; (3) Investigating the direct and indirect effects of teacher attributes, classroom attributes, and instructional strategies on Namibian junior secondary school teachers' locus of control, self-efficacy, and attitudes toward desertification; (4) Investigating the effect of student- and teacher-centered instructional strategies with and without conceptual advocacy on ninth-grade biology students' misconceptions, biology achievement, attitudes toward science, and cognitive retention of scientific method and measurement, spontaneous generation, and characteristics of living things. 
COMPUTER SCIENCE EDUCATION: (1) Examining the effects of a computer-based feedback and assessment environment on Taiwanese students' English language acquisition; (2) Examining the effect of a classroom restructuring involving the introductory course in computer science (CS 1); (3) Examining the perception of control relative to Taiwanese students’ affective domain (locus of control, self-efficacy, test anxiety) in three different types of testing environments: computer-based (CBT), pseudo computerized-adaptive (pseudo-CAT), and pseudo self-adaptive (pseudo-SAT). AVIATION SCIENCE EDUCATION: (1) Developing a causal model to help explain and predict the relationships among various attributes of airport executives that lead to a career in airport management. AVIATION SCIENCES: (1) Identifying factors that contributed to certified flight instructors (CFIs) becoming complacent, which could then be manifested as a lack of or reduced vigilance. (2) Identifying factors related to hazardous events that were precursors to runway incursions classified as pilot deviations. (3) Examining the relationship between factors affecting the aviation profession and the concept of aviation professionalism. (4) Examining the safety climate at targeted U.S. based aviation maintenance, repair, and overhaul (MRO) facilities. (5) Examining the survival strategies of U.S. domestic airlines relative to their route exit/entry decision patterns and air fare competition dynamics.fit.edu, 2d ago


new Exhibit A is Amara’s law, named for scientist, researcher and former President of the Institute for the Future Roy Amara. He is best known for saying, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” Boy howdy, does that ever apply to digital technologies in general and artificial intelligence specifically. AI may change the world — curing cancer, reversing climate change or taking all our jobs while creating a bunch more new ones — but mostly in the outbound years and decades. In the near term, it creates shadows for us to worry about, plan around and get distracted by even as we experiment and put it to work. In many respects, GenAI represents a victory lap for Moore’s law, based on Intel co-founder Gordon Moore’s formulation about the exponential growth of computational prowess. Originally coined in 1965 around the doubling of transistors on microchips every couple of years, it bumped up against the physical limits of silicon-based technologies. As AI models grow massively in size, from millions to billions and even trillions of parameters, the underlying hardware continues to keep pace even as margins narrow. Chip maker Nvidia, with its combination of advanced graphics processing units (GPUs) and tensor processing units (TPUs) optimized for AI tasks, pushes past transistor density to extend the exponential increase in computational power. Not far behind, but perhaps in a supporting role, is Metcalfe’s law. Named for Ethernet inventor Robert Metcalfe, it suggests that the value of a network is proportional to the square of the number of its users. When the campaign for digital government was young, broadband penetration reached 51 percent — giving advocates the opportunity to claim that government could then serve a “digital majority.” As of last year, that number had reached 90 percent. 
The stakes are high for the remaining 10 percent, often characterized as underserved communities including low-income and racialized populations along with people who have chosen not to engage in a connected world. By extension of Metcalfe’s law, more user interactions contribute to AI’s knowledge, which relies on iterative feedback for fine-tuning and improvement. The absence of marginalized voices is deafening to generative models. They cannot be trained with what isn’t there, increasing the risk of unconscious bias and the skewing of results.GovTech, 2d ago
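The two growth laws invoked in this piece reduce to simple formulas — exponential doubling for Moore and quadratic network value for Metcalfe — which a few lines make concrete (illustrative numbers only):

```python
def moore_transistors(start, years, doubling_period=2):
    """Exponential growth: capacity doubles every doubling_period years."""
    return start * 2 ** (years / doubling_period)

def metcalfe_value(users):
    """Metcalfe's law: network value proportional to users squared."""
    return users ** 2

# Moore: the Intel 4004 (1971) had ~2,300 transistors; doubling every two
# years for 50 years lands in the tens of billions -- the right order of
# magnitude for today's flagship chips.
projected = moore_transistors(2_300, 50)

# Metcalfe: growing broadband penetration from 51% to 90% of a population
# is a 1.76x increase in users but more than a 3x increase in network value,
# which is why the last 10 percent matters so much.
gain = metcalfe_value(0.90) / metcalfe_value(0.51)
```

The quadratic term is also why excluding a marginalized 10 percent costs the network far more than 10 percent of its value.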
new The success of ChatGPT speaks foremost to the power of a good interface. AI has already been part of countless everyday products for well over a decade, from Spotify and Netflix to Facebook and Google Maps. The first version of GPT, the AI model that powers ChatGPT, dates back to 2018. And even OpenAI’s other products, such as DALL-E, did not make the waves that ChatGPT did immediately upon its release. It was the chat-based interface that set off AI’s breakout year. There is something uniquely beguiling about chat. Humans are endowed with language, and conversation is a primary way people interact with each other and infer intelligence. A chat-based interface is a natural mode for interaction and a way for people to experience the “intelligence” of an AI system. The phenomenal success of ChatGPT shows again that user interfaces drive widespread adoption of technology, from the Macintosh to web browsers and the iPhone. Design makes the difference. At the same time, one of the technology’s principal strengths – generating convincing language – makes it well suited for producing false or misleading information. ChatGPT and other generative AI systems make it easier for criminals and propagandists to prey on human vulnerabilities. The potential of the technology to boost fraud and misinformation is one of the key rationales for regulating AI. Amid the real promises and perils of generative AI, the technology has also provided another case study in the power of hype. This year has brought no shortage of articles on how AI is going to transform every aspect of society and how the proliferation of the technology is inevitable. ChatGPT is not the first technology to be hyped as “the next big thing,” but it is perhaps unique in simultaneously being hyped as an existential risk. 
Numerous tech titans and even some AI researchers have warned about the risk of superintelligent AI systems emerging and wiping out humanity, though I believe that these fears are far-fetched. The media environment favors hype, and the current venture funding climate further fuels AI hype in particular. Playing to people’s hopes and fears is a recipe for anxiety with none of the ingredients for wise decision making.GovTech, 2d ago
new Having recently gone through a process of building that connection between IT and ed tech in Sacramento County, Brooks offered several recommendations for how to do it. One was to begin meetings by focusing on projects and ideas where both sides are in broad agreement, since some ideas will inevitably be contentious. Another was not to underestimate the need for communicating even the simplest things, like the IT department’s direct email and phone number, and to make sure teachers know them. As an example, Brooks mentioned his team’s work with El Centro Jr./Sr. High Juvenile Court School after the pandemic, which entailed building new systems to track, update and standardize personal devices; building a collaborative team with the input and perspective of teachers and administrators at El Centro; hiring an ed-tech coordinator; and training teachers on the 21st-century skills they’re now expected to impart upon students. He said when teachers asked for new Wi-Fi access points at the school that students could use outside of regular hours, conversations with IT led them to realize it wasn’t doable because of the school’s particular restrictions on unsupervised use of its network and devices. But because IT had open lines of communication with the school’s teachers and staff, they understood the problem and it didn’t turn into a fight. Co-leading the session with Brooks, SCOE’s Director of Computer Science and Digital Learning Jared Amalong said one of his strategies for building such relationships is to assume best intentions and remember everyone has a common purpose: to help students. “What I’ve learned through this work is that Andrew and his team are right here alongside our teachers, our paraeducators, our counselors, our administrators, our ed-tech staff, for the same very reason,” he said. 
“We’re here to do good things for students, and when you bring that mental model, even when those conversations are tough, we’re able to find resolution and what’s good for students.”...GovTech, 2d ago
new KAREN HAO: I think what we've seen is not with, with open source models, I would not describe that as democratizing AI development in the way that I was sort of trying to evoke earlier. The thing about democratizing AI development and governance is that people should be also able to say that they don't want things to be released. So, you know, Meta has taken, they're sort of seen as a, as a bit of a champion around open source AI development. They've taken a stance of we're going to release these models and allow anyone to commercialize off of these models. But no one actually has a say other than Meta about whether they released those models, you know, so that in and of itself I think is undemocratic. And part of the, part of the issue as well with the way that Meta so-called "open sources" its AI models is they allow anyone to take it, download it, and manipulate it, but they don't actually tell anyone how this technology was developed. So one of the things that people have really pushed for heavily-researchers, certain researchers have pushed for heavily-is the fact that Meta or any company could actually open source the data that they use to train these models. You know, you could open source the data, understand far more about what's actually being poured into these technologies and you wouldn't actually accelerate necessarily- it's sudden proliferation everywhere. So one, one of the concerns with the way that Meta behaves that, you know, an OpenAI or other type of company that has a more closed model approach argues is, "Oh look Meta is just accelerating this race, this competitive race and actually creating more dangerous dynamics," sort of to your question of like, does that actually make things worse? And, and I would say like there are actually many ways to increase the democratic governance of these, these technologies and kind of the scientific examination of these technologies without actually accelerating this race such as the data open sourcing. 
And by doing that you then enable many more scientists to study what we are actually feeding these models, and then also to study how what we feed in relates to what comes out. And you end up, hopefully, through a lot more scientific research, debate, and contestation, with better models: models that don't break, that work for more people, and that are hopefully less computationally and data intensive, so they're not as costly to the environment. You would end up with what I think would be more beneficial AI development. Big Think, 2d ago
new The cautious yet optimistic adoption of these technologies by cities like Boston and states like New Jersey and California signals a significant shift in the public-sector landscape. The journey from skepticism to the beginnings of strategic implementation reflects a growing recognition of the transformative potential of AI for public good. From enhancing public engagement through sentiment analysis and accessibility to optimizing government operations and cybersecurity, generative AI is not just an auxiliary tool but a catalyst for a more efficient, inclusive and responsive government.

However, this journey is not without its challenges. The need for transparent and accountable technologies, responsible usage, constant vigilance against potential misuse, and the importance of maintaining a human-centric approach in policymaking are reminders that technology is a tool to augment human capabilities, not replace them.

With responsible experimentation and a commitment to continuous learning, governments can harness the power of generative AI to reshape how they deliver public services. The future of governance is being rewritten, and it's up to us to ensure that this story is one of progress, inclusivity and enhanced public welfare.

Beth Simone Noveck is a professor at Northeastern University, where she directs the Burnes Center for Social Change and its partner projects, the GovLab and InnovateUS. She is core faculty at the Institute for Experiential AI. Beth also serves as chief innovation officer for the state of New Jersey. Beth’s work focuses on using AI to reimagine participatory democracy and strengthen governance, and she has spent her career helping institutions incorporate more participatory and open ways of working. GovTech, 2d ago
ASU has a history of rapidly growing its research capacity, helping to alleviate the effects of climate change and advancing new scientific fields. In FY22, ASU’s research enterprise continued its commitment to exploration - from the Earth’s oceans to the hypothesized former oceans of Mars. ASU’s rapid bioscience expansion resulted in new treatments for children with autism. And with the nation’s largest cohort of engineering students, ASU has taken on the challenge of helping the U.S. regain the leading edge in microelectronics research and manufacturing, producing the chips that are ubiquitous in today’s smartphones, cars and appliances. ASU News, 3d ago

Latest

BESS come in different shapes and sizes, from small batteries connected to rooftop solar arrays as a backup power source, to large systems that can provide electricity to hundreds of homes and businesses. Some rely on different chemistries, too, but the purpose of each is generally the same: to store large amounts of energy. The basic technologies used in most of today’s battery systems have been around for decades, experts say. What has changed is the demand, especially for large installations.

As of the middle of this year, Georgia had an estimated 5,200 megawatts of solar capacity installed statewide - enough to power roughly 626,000 homes - nearly all in the form of utility-scale arrays that cover hundreds of acres. Those facilities produce huge amounts of energy, but it is intermittent: on cloudy days and during overnight hours, generation drops off. At other times, when the sun is shining, a solar facility’s electricity output can exceed demand.

Enter BESS. The battery systems can be charged with extra solar electricity at times of low usage, and the stored energy pushed onto the grid instantly when demand spikes. Matthew McDowell, the co-director of Georgia Tech’s Advanced Battery Center, said it’s a “no-brainer” for Georgia utilities to add more BESS. “Solar panels work really well in Georgia, and combining solar panels and batteries at grid-scale makes a lot of sense,” McDowell said.

In a clearing 30 minutes outside Columbus, Georgia Power is almost finished installing what it says will be the state’s largest battery storage facility yet, a 65-megawatt system of lithium-ion batteries. It is expected to come online in the first half of 2024. The technology isn’t much to look at: the battery stacks, which sit on concrete slabs, are housed inside stark, gray metal enclosures. Inside, black batteries resembling the ones found under the hood of a car are stacked on trays.
A large air conditioner is built into the door of each container to keep the batteries from overheating. This is the first large battery system Georgia Power is adding, but many more are on the way. Last year, the Georgia Public Service Commission approved the company’s plans to add 765 more megawatts of BESS to its system. And in a recent update to its plans filed with the commission, Georgia Power is seeking to add another 1,000 megawatts of batteries - equal to the maximum output of one of Plant Vogtle’s new reactors - by the end of 2027, to meet the increased demand the company says it expects as Georgia’s economy booms.

On a recent visit to the site, Aaron Mitchell, Georgia Power’s vice president of pricing and planning, said the company sees batteries as a valuable asset. In addition to pairing nicely with solar, he said the batteries can provide extra capacity to help meet surges in demand, like during a deep freeze. Texas, which suffered crippling blackouts during a deadly winter storm in 2021, has since added thousands of megawatts of batteries to its system, helping the state avoid a similar outcome in recent heatwaves. “This is just the beginning,” Mitchell said.

Georgia Power isn’t the only electricity provider adding batteries in the state. With the help of new federal funding, Oglethorpe Power plans to add three large battery storage systems, each with a capacity of 25 megawatts, to its fleet around the metro Atlanta area. Meanwhile, at least one local company is working to develop the next generation of BESS: Alpharetta-based Stryten Energy is working to advance its vanadium redox flow batteries (VRFB), a type of BESS that uses a different electrochemistry to store energy.
Unlike lithium-ion batteries, VRFBs use a liquid electrolyte with ions of the chemical element vanadium dissolved in the solution to hold a charge. Stryten is also collaborating with Georgia Tech to advance its vanadium batteries, a project that’s being led by Tech assistant professor Nian Liu.

VRFBs have efficiency and price hurdles to address, but McDowell - who is also participating in the research with Stryten - said they also have some advantages over lithium-ion, especially for large applications. VRFB batteries do not present a fire risk, their energy storage capacity does not degrade over time, and the tanks of liquid can be stacked on top of each other, unlocking the potential for much bigger battery systems to be built.

This summer, Stryten partnered with Snapping Shoals EMC to install its first VRFB demonstration in Georgia near Covington. Mike Judd, the president and CEO of Stryten, said he hopes the Snapping Shoals project will help the company further commercialize its proprietary technology. “We want them (Snapping Shoals) to beat it up and give us feedback on it, because we’re just going to take the one that they have and scale it bigger and bigger and bigger.”

© 2023 The Atlanta Journal-Constitution. Distributed by Tribune Content Agency, LLC. GovTech, 3d ago
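The charge-on-surplus, discharge-on-deficit role these battery systems play can be sketched as a toy dispatch rule in Python. All numbers and the greedy logic here are illustrative assumptions, not Georgia Power's actual control scheme:

```python
def dispatch_battery(solar_mw, demand_mw, soc_mwh, capacity_mwh, max_rate_mw):
    """Toy greedy dispatch for one hour: charge on surplus, discharge on deficit.

    Returns (grid_balance_mw, new_soc_mwh). A positive balance is excess power
    the battery could not absorb; a negative balance is unmet demand.
    """
    surplus = solar_mw - demand_mw
    if surplus > 0:
        # Charge with the extra solar, limited by power rating and headroom.
        charge = min(surplus, max_rate_mw, capacity_mwh - soc_mwh)
        return surplus - charge, soc_mwh + charge
    # Discharge to cover the deficit, limited by power rating and stored energy.
    discharge = min(-surplus, max_rate_mw, soc_mwh)
    return surplus + discharge, soc_mwh - discharge

# Sunny hour: 120 MW solar vs 80 MW demand -> the battery absorbs the 40 MW surplus.
balance, soc = dispatch_battery(120, 80, soc_mwh=10, capacity_mwh=65, max_rate_mw=65)
# Evening hour: no solar, 50 MW demand -> the battery serves it from storage.
balance2, soc2 = dispatch_battery(0, 50, soc_mwh=soc, capacity_mwh=65, max_rate_mw=65)
```

A real grid operator would also weigh prices, forecasts, and battery degradation; the point here is only the store-now, serve-later mechanic the article describes.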
These projects are distinct from one another in experimental design. They share the potential, however, to revolutionize what is possible in the care of extremely preterm babies. Existing forms of neonatal care are emergency interventions: the baby is given treatments to stave off the effects of being born with significantly underdeveloped organs. The artificial womb, in contrast, extends the period of gestation to prevent these complications from arising to begin with. If it works, it will enable the infant to keep growing as though it had not yet been born. And with scientists anticipating human trials within the next few years, artificial-womb technology is no longer purely speculative.

The “before” and “after” images released by the biobag team were eerie and briefly ubiquitous. In the first, a floating, pink-skinned, wrinkled lamb fetus sleeps adrift in a transparent bag. In the second, it has grown soft white wool and its body presses against the plastic surface, waiting to be born. These pictures evoke much the same reaction that people once felt when they first encountered incubators: the curious sensation of peering into the future. The Walrus, 3d ago
Newswise — Microorganisms such as bacteria and sperm can undergo adaptive shape morphing to optimize their locomotion in the environment, which enables them to navigate complex barriers and improves survival. Inspired by this autonomous behavior, artificial reconfigurable microrobots have been proposed to realize similar adaptation capabilities.

Helical microswimmers are particularly promising for biomedical applications because of their unique propulsion mechanism. Under a rotating magnetic field, helical microswimmers can transition dynamically between tumbling and corkscrewing motions, allowing them to navigate complex terrain and achieve targeted drug delivery.

In a new paper published in Light: Advanced Manufacturing, a team led by Professor Jiawen Li from the University of Science and Technology of China has developed new ways to make helical microswimmers that are very small, with dimensions in the micrometer range. This is important because many biological structures, such as cells, capillaries, and wrinkles on cell surfaces, are also very small.

Microscale helical microswimmers are promising tools for biomedical applications such as drug delivery and targeted therapy. However, some challenges still need to be addressed before these microswimmers can be widely used in clinical practice. One challenge is fabrication: femtosecond direct laser writing (fs-DLW) is a powerful tool for fabricating complex 3D structures with high resolution, but it is a slow and inefficient process, which makes it difficult to produce large quantities of microswimmers quickly. Another challenge is adaptive locomotion in complex environments: inside the body, pH values vary among different tissues, and numerous microchannels and obstacles exist.
Microswimmers need to be able to sense their environment and adapt their locomotion accordingly to reach their target destination. Researchers are addressing these challenges in two ways: by developing fabrication methods that are more efficient and scalable than fs-DLW, and by designing microswimmers that are more responsive to environmental stimuli.

In this study, the researchers developed a new method for fabricating pH-responsive helical hydrogel microswimmers, called rotary holographic processing. It is much faster than traditional methods, producing a microswimmer in less than a second - approximately one hundred times faster than the point-by-point scanning strategy.

The microswimmers are made of hydrogel, a material that can absorb water, which makes them responsive to pH changes. When the pH of the surrounding environment changes, the microswimmers change shape, and this shape change allows them to move in different ways. Under a constant rotating magnetic field, the microswimmers can tumble or corkscrew, depending on the pH of the environment: in a low-pH environment they contract and tumble; in a high-pH environment they expand and corkscrew.

The researchers assessed the microswimmers in various conditions and found that they could traverse complex terrain and deliver drugs to targeted cells. This suggests that rotary holographic processing is a promising method for fabricating fast, reconfigurable microswimmers that can navigate complex environments, making them well suited to targeted drug delivery and cell therapy tasks. newswise.com, 3d ago
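The pH-to-gait switching described above can be summarized in a few lines of Python. The threshold value is a made-up placeholder for illustration, not a number from the study:

```python
def motion_mode(ph, threshold=7.0):
    """Illustrative pH-to-gait mapping for a hydrogel helical microswimmer.

    Low pH: the hydrogel contracts and the swimmer tumbles.
    High pH: the hydrogel swells and the swimmer corkscrews.
    The threshold of 7.0 is a hypothetical placeholder, not from the paper.
    """
    if ph < threshold:
        return {"shape": "contracted", "gait": "tumble"}
    return {"shape": "expanded", "gait": "corkscrew"}

# Acidic tissue -> tumbling; alkaline environment -> corkscrewing.
acidic = motion_mode(4.5)
alkaline = motion_mode(9.0)
```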
OpenAI’s definition of the term, in fact, has been somewhat flexible. The company, whose stated goal is to create AGI, defines artificial general intelligence in its charter (published in 2018) as “highly autonomous systems that outperform humans at most economically valuable work,” with a mission of ensuring that such AGI “benefits all of humanity.” But OpenAI’s CEO Sam Altman has more recently defined AGI as “AI systems that are generally smarter than humans,” a seemingly lower bar to hit.

Hype can fuel interest and investment in a technology, but it can also create a bubble of expectations that, when unmet, eventually bursts. That’s perhaps the biggest risk to the current AI boom. Some very good things might result from advances in generative AI, but it will take time. Fast Company, 3d ago
Commitment to volunteer work by HSG students is unique
A long partnership: Eastern Switzerland's textile industry and its university
Choix Goncourt de la Suisse: HSG students take part in France's most important literary competition
HSG work researcher Hans Rusinek combines sustainability and New Work
The first cohort of the joint Executive MBA ETH HSG (emba X) successfully graduates
20th SME Day in St.Gallen: Turning negative momentum into an opportunity
Personality in Residence Thomas Zurbuchen: "The universe is full of surprises"
HSG presents photo exhibition: "100 Years Akris – 125 Years University of St.Gallen"
Miriam Meckel on the use of artificial intelligence in science
HSG Community Festival draws 1,000 alumni back to their alma mater
HSG Master's Celebration: Ambition as Inner Drive
HSG Bachelor Celebration: The "Beat of Life"
Success in entrepreneurship lies in the balance between intuition and rationality
TEDxHSG: Ideas for exploring the future of learning
International Symposium: Many questions and a few answers about the future of universities
55 New Doctors at HSG
44 Propositions on the Future of Universities
Two HSG spin-offs in the top 3 of the 100 best Swiss start-ups
Student theses: HSG authors want to tap into unused potential
European Student Orienteering Championships
"DreamTeam": HSG Mentoring Programme that inspires
Playfully Healthy: HSG Short Film "Healthification" receives Gold Award
Crux of Capitalism Initiative
Is sustainable investing a dangerous placebo?
Journey to Odessa: Professor James Davis travels to Ukraine
SQUARE is fully supplied with sustainable energy
Survey among Generation Z: Does hybrid working boost careers?
15 years of diversity & inclusion research at the HSG
HSG research on Nigerian oil spills: Commission report recommends 12 billion US dollars for clean-up
Dies academicus, the highlight of 125 years of HSG celebrations
Three research projects honoured with the "HSG Impact Award 2023"
"Swiss Mobility Monitor 2023": Mobility behaviour from baby boomers to Gen Z
52nd St.Gallen Symposium: Elliot Gunn wins this year's Global Essay Competition
Highlight of the anniversary year: "Dies academicus 2023"
"St. Gallen Helps Ukraine" initiative organises fund-raising event
"Swiss Youth in Science" at SQUARE: when young people pursue their scientific dreams
Elite Quality Index 2023 sees new global leader
Master Graduation Day: 498 graduates received their diplomas
Bachelor Graduation Day: 370 graduates received their diplomas
Food Security: wheat exports as a weapon
High-profile visit to SQUARE: Federal President Alain Berset speaks about the war in Ukraine and Switzerland's position
SQUARE celebrates its first birthday with an open day
52 doctorates awarded at HSG
Open House: HSG invites you to a voyage of discovery at Open SQUARE
The working world: offices that make people work together and optimize hybrid work
HSG - Uni SG, 3d ago
Newswise — Photonic integrated circuits (PICs) are compact devices that combine multiple optical components on a single chip. They have a wide range of applications in communications, ranging, sensing, computing, spectroscopy, and quantum technology. PICs are now manufactured using mature semiconductor fabrication technologies, which has reduced costs and improved performance, making PICs a promising technology for a variety of applications.

Photonic packaging, however, is much more challenging than electronic packaging. PICs require much higher alignment accuracy, typically at the micron or even submicron level, because the optical modes of the components need to be precisely matched. This tight alignment tolerance makes PICs incompatible with mainstream electronic packaging techniques and infrastructure. Additionally, the increasing demand for heterogeneous or hybrid integration of multiple material platforms (such as silicon, III-V compounds, and lithium niobate) is another challenge for photonic packaging. New packaging technologies and device architectures are needed to address these challenges.

In a new paper published in Light: Advanced Manufacturing, a team of scientists led by Dr. Shaoliang Yu and Qingyang Du has developed such packaging technologies.

Two-photon lithography (TPL) is a laser-based technology that can be used to create 3D structures with very high resolution. It has recently emerged as a promising approach for photonic packaging, the process of assembling and connecting photonic components into a single system. TPL offers several unique advantages here. It can be used to create various 3D photonic structures, such as beam shapers and mode transformers, which is important for achieving high coupling efficiency and wide bandwidths when connecting different optical components in a system. It can also form optical connections between photonic components after assembly.
This is because the shape of the connections can be customized according to the relative displacement between the components. This relaxes the alignment tolerance during PIC assembly and enables the use of standard electronic assembly techniques.

TPL can create high-channel-density, low-loss 2.5D or 3D links to accommodate the height differences between the optical ports inside a package. This is particularly important for hybrid integration, in which modules are patterned on different substrates with varying thicknesses. TPL can also be used to form micro- and nano-mechanical structures that guide precise component placement in a passive alignment process, or that serve as pluggable optical connectors. In addition to these advantages, TPL resins are typically broadband and low in optical attenuation, making them suitable for building low-loss optical links between dissimilar material platforms.

Overall, TPL is a versatile and powerful technology for photonic packaging. It offers several unique advantages that can help to address the challenges of packaging PICs, such as the tight alignment tolerance and the need for heterogeneous or hybrid integration. As the photonics industry increasingly adopts TPL, further research and development efforts are underway to boost TPL fabrication throughput, expand the material repertoire, and develop new design and characterization tools. newswise.com, 3d ago
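Why micron-level alignment matters can be seen from the standard mode-overlap result: for two identical Gaussian modes of waist radius w0 laterally offset by d, the power coupling falls off as exp(-(d/w0)^2). The sketch below uses an illustrative 2.5 µm waist (a typical order of magnitude for an edge coupler, not a value from the paper):

```python
import math

def gaussian_coupling(offset_um, waist_um):
    """Power coupling between two identical Gaussian modes with lateral offset.

    eta = exp(-(d / w0)^2): the standard overlap-integral result for equal
    mode-field radii, used here only to illustrate alignment tolerance.
    """
    return math.exp(-(offset_um / waist_um) ** 2)

# With a 2.5 um waist, a 1 um misalignment already costs roughly 15% of the power.
eta_aligned = gaussian_coupling(0.0, 2.5)   # perfect alignment -> eta = 1
eta_offset = gaussian_coupling(1.0, 2.5)    # ~0.85
```

This steep exponential roll-off is exactly why sub-micron placement accuracy, or TPL-written connections that compensate for the measured displacement, are needed.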

Latest

Dual-use technologies are of critical importance to national security due to their applications in both civilian and military contexts. The strategic competition between the U.S. and China is centered around these cutting-edge technologies, particularly in the semiconductor chip sector. In recent years, Washington has tightened export controls against China to impede its technological progress. For example, in October 2022, the Bureau of Industry and Security issued a rule that restricts China’s access to advanced semiconductors and manufacturing tools. However, in today’s globalized world, no single country possesses a complete ecosystem for chip production. The U.S. excels in chip design software and research and development, South Korea and Taiwan lead in manufacturing the most advanced chips with sizes below 10 nanometers, and Japan specializes in manufacturing materials. Thus, the U.S. alone cannot effectively contain China, and collaboration among these three tech leaders is crucial in shaping the future trajectory of technological competition between China and the U.S. China-US Focus, 3d ago
Schmalian is one of the main authors of the study published in Science. Back in 2022, researchers from several institutes of KIT and MPI CPfS had already published an article in Nature, reporting that mechanical pressure in a certain direction considerably increased the transition temperature of strontium ruthenate and, as a result, changed the excitation behavior of its electrons. In cooperation with international partners, the researchers from Karlsruhe and Dresden have now found that the same pressure that enhances superconductivity makes the material much softer and easier to deform. The researchers attribute this to a quantum mechanical resonance of the electron oscillations. newswise.com, 3d ago
Newswise — For many decades, the rock mechanics community has tacitly assumed that a rock mass can be equated to the sum of its fractures and intact rocks. Accordingly, it is believed that the behaviour of a rock mass can be understood by decomposing it into smaller pieces and characterising those pieces completely. However, from the statistical physics point of view, this commonly assumed equation, i.e. rock mass = fractures + intact rocks, is incorrect, or at least incomplete.

“Rock mass is a complex system formed by numerous fractures and rocks that interact with each other across spatiotemporal scales,” explains Qinghua Lei, the sole author of a new study published in Rock Mechanics Bulletin. “In such a complex system, entirely new properties could emerge at a higher level arising from the collective behaviour of constituent components at the lower level, such that the system exhibits properties that its parts do not have on their own, for which reductionism breaks down. So, more is different!”

Dr. Lei, an Associate Professor at Uppsala University and a former Senior Researcher & Lecturer at ETH Zurich, further explains: “Consequently, the large-scale behaviour of a rock mass cannot be predicted by simple applications of the knowledge of small-scale core samples, due to the hierarchy of scales, heterogeneities, and physical mechanisms, as well as the possible emergence of qualitatively different macroscopic phenomena.”

Based on a combined statistical physics and rock mechanics perspective, Dr. Lei presented a thorough discussion of the mechanisms of emergence in fractured rocks.
Additionally, he proposed a multiscale conceptual framework to link the microscopic responses of individual fractures and rocks to the macroscopic behaviour of rock masses, which consist of large numbers of both.

“This multiscale framework can serve as a useful tool to bridge experimentally established constitutive relationships of fracture/rock samples at the laboratory scale to phenomenologically observed macroscopic properties of fractured rock masses at the site scale,” Dr. Lei concludes. newswise.com, 3d ago
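The "more is different" point - that system-level properties exist which no single component possesses - has a classic toy analogue in percolation: whether a network of open cells spans a domain depends on the arrangement of the whole, not on any one cell. The sketch below is a generic illustration of emergent connectivity, not Dr. Lei's framework:

```python
def percolates(grid):
    """Check whether open cells (True) connect the top row to the bottom row.

    Uses a depth-first search over 4-connected neighbours. Connectivity is an
    emergent, system-level property: no individual cell 'percolates' on its
    own - a toy analogue of the rock-mass argument above.
    """
    n = len(grid)
    frontier = [(0, c) for c in range(n) if grid[0][c]]
    seen = set(frontier)
    while frontier:
        r, c = frontier.pop()
        if r == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                frontier.append((nr, nc))
    return False

# Same density of open cells, opposite system-level behaviour:
connected = [[True, False], [True, True]]   # the open cells form a spanning path
blocked = [[True, False], [False, True]]    # same count of open cells, no path
```

The two example grids have identical "small-scale" statistics (two open cells each), yet only one spans the domain: the macroscopic property cannot be read off from the pieces alone.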

Top

There are a few projects that have engaged with cloud computing in the blockchain space, like Siacoin, Storj, Maidsafecoin, and Golem. But what makes iExec (RLC) really stand out from the crowd? Well, unlike the listed examples that mostly focus on providing decentralized storage, iExec (RLC) steps it up a notch by not only providing cloud storage but also delivering processing power to blockchain applications, otherwise known as decentralized applications (dApps). Yes, you read that right! This means iExec is built to provide both the hardware and software required for the off-chain computations necessary for dApp functioning. That's like killing two birds with one stone, isn't it?

What's more? iExec is designed to target the entirety of the blockchain ecosystem. This means they're not narrowly focusing on certain sections of the market, but rather offering support to all dApps irrespective of the industries they represent. An all-inclusive cloud service offering in a technologically advanced blockchain industry? Now that is something you don't come across every day!

Drawing comparison with conventional cloud computing giants like Amazon S3 and Microsoft Azure, what sets iExec apart is its groundbreaking leverage of the decentralized nature of blockchain technology. This enables the iExec platform to distribute its cloud resources amongst thousands of computational nodes globally, providing an efficient off-chain solution for executing the tasks that dApps need - something that simply cannot be achieved via the prevalent centralized cloud computing systems. The brilliance behind this design comes from a team of PhDs well acclaimed in the cloud computing space, who have tirelessly poured in their knowledge, expertise, and passion. Simply put, iExec is the trailblazer of distributed cloud computing, aiming to redefine how we perceive and employ cloud resources today. Now, isn't that a catch? It surely is for me!
Stick around, as next time I'll be exploring how iExec (RLC) operates: how it can support the computing power needs of dApps and smart contracts, the application of XtremWeb-HEP, fault tolerance, security, and accountability. Exciting times ahead! CryptoLinks, 11d ago
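To make the "marketplace of compute nodes" idea concrete, here is a purely hypothetical greedy matcher in Python: each dApp task goes to the cheapest worker that advertises enough cores. This is only an illustration of the general off-chain matching concept - it is not iExec's actual protocol, smart contracts, or API:

```python
def assign_tasks(tasks, workers):
    """Hypothetical greedy matcher for an off-chain compute marketplace.

    Each task is assigned to the cheapest still-available worker with enough
    cores. Illustrative only; real platforms also handle verification,
    payment, and fault tolerance on-chain.
    """
    assignments = {}
    available = sorted(workers, key=lambda w: w["price"])  # cheapest first
    for task in tasks:
        for worker in available:
            if worker["cores"] >= task["cores_needed"]:
                assignments[task["id"]] = worker["id"]
                available.remove(worker)  # one task per worker in this toy model
                break
    return assignments

# Hypothetical sample data: two tasks, two worker nodes.
tasks = [{"id": "render", "cores_needed": 4}, {"id": "train", "cores_needed": 8}]
workers = [{"id": "w1", "cores": 8, "price": 2}, {"id": "w2", "cores": 4, "price": 1}]
matched = assign_tasks(tasks, workers)
```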
Violence is an inescapable through-line across the experiences of institutional residents regardless of facility type, historical period, regional location, government or staff in power, or type of population.

Population Control explores the relational conditions that give rise to institutional violence - whether in residential schools, internment camps, or correctional or psychiatric facilities. This violence is not dependent on any particular space, but on underlying patterns of institutionalization that can spill over into community settings even as Canada closes many of its large-scale facilities. Contributors to the collection argue that there is a logic across community settings that claim to provide care for unruly populations: a logic of institutional violence, which involves a deep entanglement of both loathing and care. This loathing signals a devaluation of the institutionalized and leaves certain populations vulnerable to state intervention under the guise of care. When that offer of care is polluted by loathing, however, there comes along with it an unavoidable and socially prescribed violence.

Offering a series of case studies in the Canadian context - from historical asylums and laundries for “fallen women” to contemporary prisons, group homes, and emergency shelters - Population Control understands institutional violence as a unique and predictable social phenomenon, and makes inroads toward preventing its reoccurrence. mqup.ca, 19d ago
Providing research testing services to both internal and external stakeholders is an integral function of the Renewable Bioproducts Institute (RBI). These services include chemical analysis; corrosion; paper, board and box testing; pulp analysis; and pulp recovery analysis. Established over 25 years ago, RBI's testing services are well known in the industry for their quality and customer service. RBI is one of ten interdisciplinary research institutes at Georgia Tech and champions innovation in converting biomass into value-added products, developing advanced chemical and bio-based refining technologies, and advancing excellence in manufacturing processes.

RBI's research testing service is staffed by a team of professional scientists and engineers who work together to provide the information and solutions required by manufacturers and users of biomass products, as well as by Georgia Tech faculty and students engaged in research on campus. The team's multidisciplinary capabilities make it uniquely qualified to address customers' technical needs in process and product development and quality control. Where appropriate, the team involves RBI faculty and other staff experts to arrive at the best possible solution for its customers and users.

In this article, we will focus on a day's work with the chemical analysis team. Headed by Rallming Yang, senior research scientist at RBI, the team is equipped to follow the Technical Association of the Pulp and Paper Industry (TAPPI) standards of testing - something only a small number of labs in the country can do - and has also developed some of its own internal protocols. Yang leads two characterization programs within RBI: (1) the pulping and bleaching analysis, paper recycling, and recovery lab, and (2) the chemical analysis lab.

The chemical analysis team is busy year-round with research projects and testing services.
In addition, during the spring semester, the team supports a paper science laboratory course for undergraduate and graduate students. Recently, chemical analysis of black liquor from pulp mills has kept the team busy, with more than 30 projects completed over three months for various industry customers. Black liquor analysis currently accounts for over 50% of the lab's workload.

Black liquor (BL) is a byproduct of wood pulping, released when cellulose fibers are separated from wood chips. BL contains lignin, which is used as a biofuel within the mill, and several other chemicals that are recovered and reused. In most pulp mills, nearly 50-70% of BL is converted into a convenient source of fuel or energy. Because of the important role black liquor plays in a paper mill, it needs to be tested regularly to ensure consistency in composition. The RBI chemical analysis lab receives BL samples from pulp mills, which contact the lab by email to get their testing requests into the queue. The testing process is intensive, with multiple steps that need to be carefully administered.

In the first step, inorganic elements in BL are identified by digesting it in a precise mixture of acids and filtering the mixture. The filtrate is introduced into an inductively coupled plasma (ICP) emission spectrometer that can identify more than 70 different inorganic elements and compounds, such as sulfur, potassium, sodium, iron, and calcium. The next step identifies the proportion of anions like sulfate, chloride, and thiosulfate: BL is diluted to a specific level and analyzed using a method called capillary ion electrophoresis (CIE).

The next step analyzes BL for organic substances using two methods - gas chromatography mass spectrometry (GC/MS) and Fourier transform infrared spectrometry (FTIR).
For organic substances with a molecular weight below 600 Daltons (Da), GC/MS is employed: the gas chromatography separates the chemical mixture, and the mass spectrometry identifies each of the components. The final step identifies organic substances and polymers with higher molecular weights. For example, lignin is one of the main polymers in BL, with a molecular weight above 600 Da; FTIR is used for testing during this step. Based on vibrations within each molecule, an FTIR spectrum allows identification of molecular groups within lignin. The instrument's software then identifies the substances by comparing the sample spectrum with a built-in library.

The RBI team provides detailed lab reports that are used by the pulp mill to adjust its operating parameters for trouble-free operations. In addition to the chemical analysis of byproducts like black liquor and other chemical compounds, Rallming Yang's team also conducts studies on pulping and bleaching, repulping, and fiber characterization. gatech.edu, 25d ago
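The instrument-selection logic the article walks through (inorganics to ICP, anions to CIE, organics below 600 Da to GC/MS, heavier organics such as lignin to FTIR) can be sketched as a simple router. The function itself is an illustrative sketch, not RBI lab software:

```python
def assign_instrument(analyte):
    """Route a black-liquor analyte to the instrument named in the article.

    Categories and the 600 Da cutoff follow the workflow described in the
    text; the function is purely illustrative.
    """
    kind = analyte["kind"]
    if kind == "inorganic":
        return "ICP emission spectrometer"
    if kind == "anion":
        return "capillary ion electrophoresis (CIE)"
    if kind == "organic":
        # Light organics go to GC/MS; heavier polymers like lignin go to FTIR.
        return "GC/MS" if analyte["mw_da"] < 600 else "FTIR"
    raise ValueError(f"unknown analyte kind: {kind}")

# Lignin (a heavy polymer) routes to FTIR; a light organic routes to GC/MS.
lignin_route = assign_instrument({"kind": "organic", "mw_da": 1000})
light_route = assign_instrument({"kind": "organic", "mw_da": 150})
```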
Following aesthetics, HyperTone Camera Systems will usher in a new era in computational photography. The systems will initially arrive in future Find series flagships, providing users with an unprecedented mobile imaging experience beyond imagination. OPPO Imaging Director Oliver Zhang said: "Thanks to the HyperTone Camera System which follows aesthetics, the Find X6 Series is positively received by both the public and professionals. With Find N3 Series, we also brought flagship-level imaging experience for foldables for the first time. I am delighted that our collaboration with Hasselblad has now entered a new phase. By combining classic imaging aesthetics with mobile technology, OPPO and Hasselblad have developed an aesthetic system that is tailored for mobile imaging. In 2024, the two companies will launch the next generation HyperTone Camera Systems and Hasselblad camera experience that promise to continue to set new industry standards." Bronius Rudnickas, Hasselblad Global Marketing Manager, said: "We are very pleased to see OPPO's latest advancements in mobile imaging, which allow people to use their smartphone to bring their creativity to life through photography. Our continued collaboration to build the next-generation HyperTone Camera Systems is a huge project that is not just a simple inheritance of style. Instead, it is a more organic and in-depth integration of the aesthetics and technology of both brands that will provide photography enthusiasts with enhanced image quality and stylized experiences, beyond what's achievable on typical mobile phones. I am looking forward to sharing the results of our collaboration with all the creators next year."

Following aesthetics, OPPO is leading a revolution in mobile imaging

As a pioneer in mobile imaging, OPPO is leading a revolution by following aesthetics rather than relying on industry traditions of chasing certain technical specifications.
This new approach allows the objective technical process to serve the subjective aesthetic of users. The Photography Aesthetics Feature Quantification Lab is made up of 10% photography technicians, 50% professional photographers, and 40% color experts. Through feature breakdown, solutions planning, and precise tuning, OPPO has created a new photography technology development system, ushering in a new era in computational photography.

HyperTone Camera System

OPPO's HyperTone Camera System consists of the HyperTone All Main Camera System, HyperTone Image Engine, and HyperTone ProXDR Display, to deliver end-to-end system-level imaging capabilities. In addition to Find series flagships, the technical architecture of the HyperTone Camera System will empower multiple OPPO product series in the future, for a flagship imaging experience with consistent aesthetics and exceptional quality. To meet users' mobile imaging needs, the HyperTone Camera System offers the first-ever All Main Camera System in the industry. It enables high image quality at every focal length, day or night. In the Find N3, OPPO brings the stack pixel technology sensor to foldable phones for the first time, achieving performance similar to a one-inch sensor in a form that can be fitted into a foldable. Following aesthetics, the HyperTone Image Engine solves many issues in traditional computational photography, achieving fewer digital artifacts with more computation. The Extra HD Algorithm, for example, improves clarity by 30% while reducing noise by 60% by using AI RAW fusion. The new algorithm requires 400% more computing power, but it results in cleaner and clearer details after processing. Photo viewing is an indispensable part of a great imaging experience. The HyperTone Camera System comes with the industry-leading HyperTone ProXDR Display.
By precisely recording the brightness of 12 million pixels, ProXDR technology can unlock up to eight times more dynamic range on-screen, restoring natural tones to make viewing photos much more realistic. Additionally, OPPO is also working to make ProXDR compatible with the Ultra HDR standard.

Hasselblad Portrait Mode

Portrait is not only one of the most popular camera features, but it also represents the most cutting-edge innovation in computational photography. Following aesthetics, Hasselblad Portrait Mode provides portrait photography effects that lead the industry. Through the HyperTone Engine, Hasselblad Portrait Mode initiates delicate tonal changes to create a three-dimensional portrait that avoids brightening the face too much and smoothing out light and shadow. By optimizing with 45 times more skin tone colors, Hasselblad Portrait Mode offers more accurate skin tones. And with Bokeh Flare Portrait, users can get a cinematic bokeh effect that truly simulates the optical out-of-focus effect.

Work shot using OPPO phones at Paris Photo

In 2023, OPPO launched the imagine IF Project to break through the existing limitations of mobile imaging with global users. As a key part of the project, the OPPO imagine IF Photography Awards 2023 has received over 700,000 submissions from 51 countries and regions. The 26th Paris Photo event will be held in Paris, France from November 9-14, 2023. As the only smartphone brand participating in Paris Photo this year, OPPO presented a series of exceptional images, including the works of world-class photographers and the winning works from the OPPO imagine IF Photography Awards 2023. OPPO's presence at Paris Photo serves as a bridge for fostering cross-cultural communication, revealing new possibilities of mobile imaging to photographers, collectors and artists worldwide.
Norwegian photography artist and Hasselblad Master, Tina Signesdottir Hult, acclaimed photography artist and Hasselblad master, Wang Jianjun, renowned fashion photographer and Hasselblad Ambassador, Yin Chao, OPPO imagine IF Photography Awards 2023 winner, Ahei Huan, as well as many other photographers, will attend the festival to witness the first presentation of their work shot using OPPO phones at Paris Photo. Hashtag: #OPPO...SME Business Daily Media, 25d ago
The 1950s saw a post-war experimental biology renaissance. One of the earlier papers from this era – August Krogh's final contribution to the journal, published with a young Torkel Weis-Fogh – reported the first measurements of metabolic rate and respiratory quotient in a flying desert locust, Schistocerca gregaria (Krogh and Weis-Fogh, 1951), while Rupert Billingham, with Medawar, laid down the basic procedures and principles of skin transplantation in their seminal paper, ‘The technique of free skin grafting in mammals’ (Billingham and Medawar, 1951). Revisiting the article, Jeremy Santa Ono (University of Michigan, USA) said, ‘Both clinicians and scientists still turn to the paper to understand fundamental concepts in dermatology and as a primer to skin grafting in mammals’ (Ono, 2004). By the middle of the decade, Gray switched his intellectual focus from the macroscopic motion of snakes and eels to the microscopic world of spermatozoa, teaming up with aeronautical engineer G. J. Hancock (Queen Mary College, London, UK) to explain how the beating flagellum propels the minuscule gametes (Gray and Hancock, 1955). Gathering more than 40 Google Scholar citations in 2023, the paper continues to inspire and contribute to the nanotechnology revolution almost 70 years after Hancock derived the basic mathematics of flagellar propulsion. Three years later, University of Cambridge scientists Hans Lissmann and Ken Machin revealed a previously unidentified sense when they discovered that weakly electric fish can locate objects through the distortions in the electric fields they produce (Lissmann and Machin, 1958). ‘The story told in Cambridge was that the first clue to the electric sense had come when a student combed her hair near [the fish's] tank, and the Gymnarchus went wild’, McNeill Alexander reminisced (Alexander, 2006).
Regardless of whether the tale is truth or myth, together the pair revealed and explained a novel sense ‘unlike anything we humans can experience’ (Alexander, 2006).The Company of Biologists, 3d ago
While in high school, Taylor attended Illinois Institute of Technology’s Early Identification Program on Saturdays. She learned about engineering by building “egg-mobiles” and parlayed that into attending numerous science, technology, engineering and math (STEM) programs during her summers.

Taylor went to Purdue University in Indiana for her undergraduate and graduate degrees. She earned her Ph.D. in electrical engineering and computer sciences at the University of California, Berkeley. She was a faculty member at Northwestern University and Texas A&M before joining Argonne.

Throughout Taylor’s career she has been encouraging and supporting others in STEM. She co-founded the Center for Minorities and People with Disabilities in IT in Lemont. The nonprofit hosts an annual Tapia Conference, where computer scientists from underrepresented communities share research, network, and find mentorship. The event has grown from 160 attendees in 2001 to 2,000 this year.

“We started the conference because oftentimes you go to conferences being from an underrepresented community, and do what I call ‘the scan,’ (looking for) others who look like me,” Taylor said. “I said: Let’s have a conference in computing to bring everybody together to celebrate diversity and all you have to do is look to your neighbor to see people who look like you, see speakers who look like you and you have community. You may go back to your institution and be the only, but while you’re sitting in a session at the Tapia Conference, you are far from the only. ... It’s about building community, making networks, but also staying technical.”

As part of Wednesday’s National STEM Day celebration, which encourages youths to explore their interests in STEM, we spoke with Taylor about diversity in the field, science fiction and artificial intelligence. The following conversation has been edited for length and clarity.

Q: There’s been a major push for more diversity in STEM.
Are you seeing more diversity in real time?

A: The National Science Foundation published a report on diversity in STEM, and they were showing that compared to a decade ago, the numbers are increasing. However, in terms of the overall field and percentages, there’s still a large gap. When you start to look at the population, and then you look at those in the field, there’s a really big gap in the numbers, in terms of ethnicity, and in terms of gender as well.

Q: How can we get that gap to be smaller?

A: A large part has to do with the environment. For example, you go from high school to college, and a question that comes up is: Are your classes welcoming? Or are there assumptions being made when you walk into the classroom? What happens in the classroom where you’re the only? And the numbers go down as you go to higher degrees. I went through that, being the only Black woman in the class. You look around ... and have to specify “I’m not here to take notes.” After I got the Ph.D., I went somewhere with a grad student of mine and they thought the grad student was the professor. ... I enjoy my job at Argonne National Labs, but you have places where it’s the mindset of people that have to change and that takes a long time to change. And sometimes it’s hard to know if it’s really changed.

Q: What’s been your most memorable experience in the field so far?

A: When I finished my Ph.D., I was faculty at Northwestern University. I went to my first conference ... and that’s when I met Rick Stevens, who was just becoming the division director for the mathematics and computer science division. He said, “Why don’t you come out and give a talk?” I came out and gave a talk and found so many people in high performance computing. When I came to Argonne, I started talking with people doing things in high-performance computing.
I was like, I found my community.

Q: What is your bread and butter at Argonne?

A: My bread and butter is making computers more energy efficient, and that’s looking at reducing execution time, but you also care about how much power is being used. Power is significant. For example, Aurora is going to be the next exascale supercomputer, which means a billion billion (not 2 billion) operations per second. You have applications that need that and it can take days, but the machine can require 60 megawatts, which is a significant amount of power. So when you talk about future needs, the amount of power this requires becomes a major issue.

Q: Should we be afraid of Skynet becoming a reality, computers becoming self-aware?

A: We are a long way off on that. You look at what the mind is able to do, make connections in split seconds. But ChatGPT took months of training to get the model that’s being used. And that’s on large-scale machines.

There’s a lot of good that can come from artificial intelligence. For example, with the pandemic and looking at vaccines, there was the National Virtual Biotechnology Laboratory — that’s where the U.S. Department of Energy, across the different labs, pooled computing resources to look at good candidate vaccines for COVID. That’s where you’re using artificial intelligence to help you narrow down the search. With any technology, you can have malicious actors. The important aspect is to put in place guardrails around that technology.

Q: Are you a Trekkie?

A: I’m a Trekkie. A fan of the original.

Q: Your favorite character?

A: Nichelle Nichols, Lt. Nyota Uhura. You look at the original now and laugh at some of the graphics. But back in the day, they had the communicator and now we have the cellphone.
I joke with some friends, we need something to “beam me up.”

Q: Can you elaborate on what you’re working on with the Department of Energy?

A: We do open science. A lot of the software we develop is available via open source. The libraries are widely available. A lot of the software is available via GitHub.

Q: What can parents and laypeople do to help STEM professionals diversify the field?

A: Science is all about asking the question, why? You begin to understand why things work as they do. It could be with nature, biology, materials. When you understand why things occur as they do, then questions come up like: “Can I develop a material that has the following characteristics?” The part that parents can do is to let students know that it’s OK if they flunk an exam. That’s one of many exams; it does not mean that they’re not fit for that field or they shouldn’t be in the class. All it means is that you need to go see the teaching assistant so that you can clear up misunderstandings.

Q: When people hear your name, what do you want people to think?

A: Valerie Taylor, an excellent computer scientist who gave back to the community and cared about increasing diversity. My dad always told us we have to give back. “Never judge. Always give because there but for the grace of God go I.”

©2023 Chicago Tribune. Distributed by Tribune Content Agency, LLC.GovTech, 26d ago
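The two numbers quoted in the interview above, a billion billion (10^18) operations per second and roughly 60 megawatts of power, imply an energy-efficiency figure that a one-line calculation makes concrete. The 60 MW value is the interviewee's example, not a published machine specification.

```python
ops_per_second = 1e18   # one exaFLOP/s: a billion billion operations per second
power_watts = 60e6      # 60 megawatts, the figure quoted in the interview

# Operations delivered per joule of energy consumed:
ops_per_joule = ops_per_second / power_watts
print(f"{ops_per_joule:.2e} operations per joule")  # ~1.67e+10, i.e. roughly 16.7 GFLOPS/W
```

Framing performance per watt this way is exactly why power, not raw speed, becomes the limiting factor at exascale.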


Newswise — Ana Mateos and Jesús Rodríguez, scientists at the Centro Nacional de Investigación sobre la Evolución Humana (CENIEH), have published a paper in the journal Palaeogeography, Palaeoclimatology, Palaeoecology which shows that large herbivore carrion, a resource that had formerly been abundant and accessible to hominins, became scarcer at the end of the Early Pleistocene due to changes in the Iberian fauna.

Hominins arrived in the Iberian Peninsula about 1.4 million years ago, where they found a wide variety of food resources, including a great abundance of carcasses of large herbivores partially consumed by a diversity of predators, important among which were two species of sabre-tooth cats (Homotherium latidens and Megantereon whitei). They also encountered a powerful competitor in these ecosystems, the giant hyena (Pachycrocuta brevirostris). However, as the same authors showed in earlier work, the wealth of food and diversity of the ecosystems at this period made the coexistence of hominins and giant hyenas competing for carrion possible.

That being said, around one million years ago there were major climatic changes which restructured the ecosystems of the whole of Europe. In the Iberian Peninsula, the large mammal fauna suffered the extinction of several species, including the giant hyena and one of the sabre-tooth cats (M. whitei), leading to lower availability of carrion.

Virtual simulations

The researchers employed a computational model which enables experiments in a virtual environment that simulate the behavior of hyenas and hominins competing for carrion.
Each experiment represents a different ecological scenario, defined by the predator species present, the productivity of the ecosystem, and the competition for carrion with other species like vultures or small carnivores.

“The giant hyenas and hominins could coexist in competition for carrion prior to the extinction of the sabre-tooth Megantereon and other predators, like the lycaons (canids) and pumas. However, after those predators disappeared, carrion became scantier. This coincides with the extinction of the giant hyena”, explains Mateos.

According to the results of these experiments, among the key factors that determined these changes were the low productivity of the ecosystems during the very cold intervals of the period, strong competition with scavengers other than the giant hyena, and the likely social behavior of the other great sabre-tooth (H. latidens). Unlike hyenas, which would have depended exclusively on large herbivore carcasses for food, hominin behavior would have been much more flexible, as they could also exploit plant resources like fruit, berries or roots, hunt small animals and even kill larger ones.

“This greater flexibility in procuring food would have allowed them to survive and adapt to the new prevailing ecological conditions following the changes in the climate and fauna one million years ago”, adds Rodríguez.

The only participant in this paper from outside the CENIEH was Ericson Hoelzchen, a scientist at the German Research Center for Artificial Intelligence (DFKI) at Trier University (Cognitive Social Simulation Lab). This work forms part of the project TROPHIc (PID2019-105101GB-I00, MCIN/AEI/10.13039/501100011033).newswise.com, 3d ago
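The kind of scenario-based simulation described above can be illustrated with a deliberately tiny deterministic model: every rule and parameter below is invented for illustration and bears no relation to the published model. The only ingredient carried over from the article is the asymmetry that hyenas depend entirely on carrion while hominins can fall back on plants and small game.

```python
def simulate(productivity, plant_fallback=0.6, steps=300, cap=100.0):
    """Toy competition for carrion under a given ecosystem productivity.

    Hyenas eat only carrion; hominins top up a `plant_fallback` share of
    their needs from plants/small game. Populations grow when well fed and
    shrink when underfed. All constants are illustrative.
    """
    hyenas, hominins = 20.0, 20.0
    for _ in range(steps):
        total = hyenas + hominins
        if total == 0:
            break
        share_h = productivity * hyenas / total    # carrion reaching hyenas
        share_m = productivity * hominins / total  # carrion reaching hominins
        fed_h = min(1.0, share_h / hyenas) if hyenas else 0.0
        fed_m = min(1.0, share_m / hominins + plant_fallback) if hominins else 0.0
        # Multiplier below 1 when mostly unfed, above 1 when mostly fed.
        hyenas = min(cap, hyenas * (0.8 + 0.4 * fed_h))
        hominins = min(cap, hominins * (0.8 + 0.4 * fed_m))
    return hyenas, hominins

rich = simulate(productivity=200)  # carrion-rich ecosystem
poor = simulate(productivity=10)   # carrion-poor ecosystem
print("rich:", rich, " poor:", poor)
```

Even this crude sketch reproduces the qualitative outcome the article reports: when carrion productivity drops, the obligate scavenger collapses while the flexible forager persists.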
The future of medicine is shifting to cloud computing, where health care providers can access patient information from anywhere at any time. This transformative shift will facilitate rapid and efficient treatment decisions, which can be crucial in emergency situations. As the fields of AI and neuroscience advance, the development of methods to interpret brain activity becomes increasingly imminent. Recent studies have already demonstrated the feasibility of such interpretation for specific brain regions. Decoding patterns of neural activity will unlock insights into the underlying mechanisms of cognitive function and behavior, potentially revolutionizing medical treatments. However, these developments also present challenges to HIPAA regulation and raise additional privacy concerns for intellectual property derived from this information. Moreover, a basic goal of integrating AI into the medical field is to train agents to identify abnormal brain scans. To do this, datasets must be expanded by several orders of magnitude, because medical diagnoses often are highly individualized with significant variability in symptom characteristics. Ensuring data security is crucial in protecting patients’ privacy and encouraging participation in neuroimaging studies. Ultimately, the integration of AI and cloud computing has the potential to revolutionize various aspects of health care. Giordano’s research will help address a vital step toward this goal: the creation of a secure encryption method capable of handling large medical imaging file formats while preserving the content and patient privacy.newswise.com, 3d ago
In deciding the courses that we offer to high-school students and in determining the most effective ways for the students to struggle productively with the material, we cannot ignore the fact that high-performing schools are steered by the expectations of selective colleges. These universities want to see their engineering majors take AP Physics, for instance. So at BLS we offer both AP Physics courses, the one in mechanics and the one in electricity and magnetism. We encourage our students interested in engineering to select one or both of them in junior or senior year, sometimes at the sacrifice of an elective they might prefer. Large universities regularly subject their freshmen to impersonal lectures in large halls, with their entire grade depending on a couple of high-stakes exams, so we recognize that there’s a place for students needing to practice taking in large amounts of information verbally and visually. Students learn to break that information down into outlines that they can study from later. We also run a week of final exams at the end of each school year so that students become accustomed to preparing for cumulative timed exams. Admissions officers strain to find ways to sort through students with similar GPAs, poring over evidence of leadership in their extracurricular activities as one potentially distinctive characteristic. So while BLS students are already strapped for time due to the six major subjects they take each year, they pile on deep commitments to their clubs, bands, sports, and part-time jobs.Education Next, 3d ago
Newswise — In today's medical landscape, antibiotics are pivotal in combatting bacterial infections. These potent compounds, produced by bacteria and fungi, act as natural defenses against microbial attacks. A team of researchers delved into the intricate world of glycopeptide antibiotics – a vital resource in countering drug-resistant pathogens – to uncover their evolutionary origins. Dr. Demi Iftime and Dr. Martina Adamek headed this interdisciplinary project, guided by Professors Evi Stegmann and Nadine Ziemert from the “Controlling Microbes to Fight Infections” Cluster of Excellence at the University of Tübingen, with support from Professor Max Cryle and Dr. Mathias Hansen from Monash University in Australia.

Using advanced bioinformatics, the team sought to decipher the chemical blueprint of ancient glycopeptide antibiotics. By understanding their evolutionary trajectory, the researchers were looking for insights that could steer the development of future antibiotics for medical applications. The team’s study has been published in the latest edition of Nature Communications.

Tracing an Evolutionary Path

“Antibiotics emerge from an ongoing evolutionary tug-of-war between different organisms, each striving to outmaneuver or curtail the spread of their adversaries,” explains Evi Stegmann. To explore this, the researchers utilized the glycopeptide antibiotics teicoplanin and vancomycin, along with related compounds sourced from specific bacterial strains. These compounds, built from amino acids and sugars, disrupt bacterial cell wall construction, ultimately leading to bacterial death. Notably, teicoplanin and vancomycin exhibit this potency against numerous human pathogens.

In simplified terms, scientists often organize species into an evolutionary tree structure to illustrate their relationships. Similarly, the research team constructed a family tree of known glycopeptide antibiotics, linking their chemical structures via the gene clusters that encode their blueprints.
Employing bioinformatics algorithms, they deduced a putative ancestral form of these antibiotics – which they dubbed “paleomycin.” By reconstructing the genetic pathways they believed to produce paleomycin, the team successfully synthesized the compound, which displayed antibiotic properties in tests. “Recreating such an ancient molecule was exhilarating, akin to bringing dinosaurs or wooly mammoths back to life,” remarks Ziemert.

Connecting Evolution to Practicality

“One intriguing finding is that all glycopeptide antibiotics stem from a common precursor,” Stegmann says. “Moreover, the core structure of paleomycin mirrors the complexity seen in teicoplanin, while vancomycin exhibits a simpler core. We speculate that recent evolution streamlined the latter’s structure, yet its antibiotic function remained unchanged,” Ziemert adds. This family of antibiotics – though beneficial for the bacteria producing them – demands substantial energy due to its complex chemical composition. Streamlining this complexity while retaining efficacy could confer an evolutionary advantage.

The researchers meticulously traced the evolution of these antibiotics and their underlying genetic sequences, investigating the pivotal steps required for creating functional molecules. In collaboration with the Australian scientists, some of these steps were replicated in laboratory settings. “This journey through time revealed profound insights into the evolution of bacterial antibiotic pathways and nature's optimization strategies, leading to modern glycopeptide antibiotics,” says Ziemert. “This provides us with a solid foundation for advancing this crucial antibiotic group using biotechnology.”...newswise.com, 3d ago
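Inferring an ancestral form from present-day descendants, as the team did for "paleomycin", is classically done with parsimony-style algorithms over a phylogenetic tree. The toy sketch below runs a Fitch-parsimony pass on a four-leaf tree; the single "complex"/"simple" character is a stand-in invented for illustration, not real gene-cluster data, though the chosen states echo the article's teicoplanin-like-ancestor story.

```python
def fitch(node):
    """Bottom-up pass of Fitch parsimony: return the set of candidate
    character states at `node`.

    A node is either a leaf (a state string) or a tuple (left, right).
    Children's state sets are intersected when possible, else unioned.
    """
    if isinstance(node, str):
        return {node}
    left, right = (fitch(child) for child in node)
    common = left & right
    return common if common else left | right

# Toy phylogeny of four present-day antibiotics, each reduced to one
# structural character: three "complex"-core descendants and one "simple".
tree = (("complex", "complex"), ("complex", "simple"))
ancestral = fitch(tree)
print(ancestral)  # {'complex'}
```

On this toy tree the most parsimonious root state is "complex", mirroring the paper's conclusion that the simpler vancomycin core is the derived, streamlined form. Real ancestral sequence reconstruction uses probabilistic models over full sequences, but the tree logic is the same.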
The research team from Seoul National University presents in this paper a pipeline named LucidDreamer that uses 3D Gaussian splatting and stable diffusion to produce a variety of high-quality 3D sceneries from several kinds of inputs, including text, RGB, and RGBD. Dreaming and Alignment are two steps that are repeated alternately to create a single, large point cloud in the LucidDreamer pipeline. The original picture and matching depth map create an initial point cloud before the two processes start. The Dreaming step consists of creating geometrically consistent pictures and projecting them into three-dimensional space. Before projecting a visible point cloud region in the new camera coordinate onto the new camera plane, the research team moves the camera along the pre-defined camera trajectory. Subsequently, the projected picture is sent into the Stable Diffusion-based inpainting network, which completes the image. By lifting the inpainted picture and the predicted depth map into 3D space, a new collection of 3D points is created. Next, by gently shifting the new points' positions in 3D space, the suggested alignment technique smoothly joins them to the existing point cloud. The research team uses the enormous point cloud that results from repeating the aforementioned procedures a sufficient number of times as the initial SfM points to optimize the Gaussian splats.MarkTechPost, 3d ago
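The "lifting" step described above (back-projecting an inpainted image plus its predicted depth map into 3D points) reduces to the pinhole-camera equations. The sketch below is a generic illustration of that geometry, not LucidDreamer's actual code; the intrinsics and the 2x2 toy image are invented.

```python
def lift_to_3d(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map into a colored point cloud via the pinhole
    model: X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth[v][u].

    `depth` and `rgb` are 2D lists indexed [row][col]; (fx, fy) are focal
    lengths in pixels and (cx, cy) the principal point.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # no valid depth estimate at this pixel
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append(((x, y, z), rgb[v][u]))
    return points

# 2x2 toy depth map and image; fx = fy = 1, principal point at (0.5, 0.5).
depth = [[1.0, 2.0],
         [0.0, 4.0]]  # one pixel has no valid depth and is skipped
rgb = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]
cloud = lift_to_3d(depth, rgb, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(len(cloud))  # 3
```

To place these points in the world frame, as the pipeline does when the camera moves along its trajectory, each point would additionally be transformed by the camera-to-world pose.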
Goal: A few months ago a paper titled Out-of-context Meta-learning in Large Language Models was published, describing a phenomenon called out-of-context meta-learning. More recently, there have been other papers on related topics, like Taken out of context: On measuring situational awareness in LLMs, or about failures of models to generalise this way, like the reversal curse paper. All of these papers have in common that the models learn to apply facts they learned during training in another context. The aim of this project is to use mechanistic interpretability research on toy tasks to understand, in terms of circuits and training dynamics, how this kind of learning and generalisation happens in models.alignmentforum.org, 3d ago


Gizem Gumuskaya 33:55 I can go. So first of all, these are not genetically modified. So let me talk about creating new structures from precursor cells. A lot of what has been done in that field, which is generally the field of synthetic morphogenesis, has so far relied on insertion of genetic circuits, which in and of itself is a really exciting area, because then we get to test our hypotheses about how form develops in nature in a very sort of organized way. For example, a really good example of that is, Alan Turing had this hypothesis for how certain patterns arise in nature, called Turing patterns. He had a mathematical theory about this, but he's not a biologist, so he never really got to prove this. And a few years ago, scientists took Alan Turing's mathematical theorem and basically created a genetic circuit out of that. So genetic circuits are very similar to electrical circuits. Instead of transistors, you just have genes interacting with one another and creating complex Boolean logic. So they encapsulated Turing's theory into a synthetic circuit and put it into living cells, bacterial cells. And the bacterial cells did indeed create these patterns that are known as Turing patterns. So without the synthetic circuit approach, we would have no way of testing this. So that's a really versatile and really interesting approach. But it dictates the use of exogenous DNA, so those are essentially genetically modified organisms. So one of the questions we asked is, could we bring the synthetic morphogenesis design of creating new structures with biological cells, without using any genes, into robots? Anthrobots are an example of that: we have only changed the environmental parameters and how we grow these. We started from tracheal cells, because we want to make motile spheroids. And, you know, tracheal cells already know how to build cilia, you don't need exogenous genes or circuits to teach them how to do it.
And by just growing them in different conditions, we're able to steer the entire system towards our target architecture of interest. And that's how we accomplished this kind of synthetic morphology. So that lack of genetic circuitry is one of the big features that make it inherently safe. Because if we want to create an Anthrobot from a given patient, we would just take the cells from that patient and build the Anthrobots that way. And at the end, that Anthrobot would have the exact same DNA as the patient. And when we inject it into the patient, it wouldn't be any different from the existing cells in that patient's body. So this is a big win for potentially preventing any type of inflammation in the body or immune response. So that's our sort of safety feature number one. And number two is the fact that Anthrobots after some time naturally degrade. So even if you just, you know, let them live forever, they don't. And the way they degrade is by becoming individual cells. So in the lab, the multicellular Anthrobot will just degrade into individual cells, which can then, depending on the tissue it's inoculated into, be expelled from the body through natural paths. So these are the two main safety features that are already built in.newswise.com, 5d ago
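The Turing-pattern experiment mentioned in the transcript rests on reaction-diffusion dynamics: two substances with different diffusion rates can turn a near-uniform field into stable spatial structure. The sketch below is a minimal 1-D Gray-Scott model with textbook-style parameters; it is a generic illustration, not the genetic circuit from the bacterial study.

```python
def gray_scott_1d(n=80, steps=4000, Du=0.16, Dv=0.08, F=0.035, k=0.060, dt=1.0):
    """Minimal 1-D Gray-Scott reaction-diffusion model (explicit Euler).

    u is the substrate, v the activator; the unequal diffusion rates
    (Du != Dv) are what allows a Turing-style pattern to emerge.
    """
    u, v = [1.0] * n, [0.0] * n
    for i in range(n // 2 - 4, n // 2 + 4):  # seed a small local disturbance
        u[i], v[i] = 0.5, 0.25

    def lap(f, i):  # discrete Laplacian with periodic boundaries
        return f[(i - 1) % n] - 2.0 * f[i] + f[(i + 1) % n]

    for _ in range(steps):
        u, v = (
            [u[i] + dt * (Du * lap(u, i) - u[i] * v[i] ** 2 + F * (1.0 - u[i]))
             for i in range(n)],
            [v[i] + dt * (Dv * lap(v, i) + u[i] * v[i] ** 2 - (F + k) * v[i])
             for i in range(n)],
        )
    return u

u = gray_scott_1d()
print(round(max(u) - min(u), 3))  # > 0: the field is no longer uniform
```

Started from uniform conditions, the localized disturbance grows into persistent spatial structure, which is the mathematical content of Turing's hypothesis that the bacterial circuit made testable.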
When I discovered these papers, I thought finance is very interesting for physicists, as we have our own methods to look into complex systems such as financial markets, with particular methodologies that are absent in classical finance. In particular, physicists are skilled in constructing mathematical models that focus on the most essential parts of the dynamics of complex systems and strip away unessential details, and then in assessing these models by considering their various theoretical limits. The great physicist Paul Dirac, who was one of the creators of quantum mechanics, once said that if you are humble and patient, then your equations will lead you by the hand to the right answer. My other hero in science, Richard Feynman, once said that the best way to understand something is to try to explain it or teach it. Which I found to be exactly right. I did find interesting scientific questions when I was trying to conceptualize and explain things that I discovered.Rebellion Research, 8d ago
Iranian authorities have announced the creation of a center for research into new types of warfare that emerged in the 21st century. The research center in question is reportedly dedicated to exploring warfare in areas including the radar, biological, sound and quantum domains, according to local media reports.

Iranian military analyst Mohammad-Hassan Sangtarash told Sputnik that “non-standard types of warfare involve new methods of waging war and new types of armaments,” and that some military experts consider such types of warfare “more effective than the classic ones.” Sangtarash specifically noted the importance of quantum technologies, arguing that mankind's ability to “control quantum systems (...) significantly expands the use of quantum technologies in different branches of the armed forces and may change the ways wars are waged.”

“Such technologies may help speed up data processing, while photon (electromagnetic) waves can be used instead of radio waves to detect enemy aircraft,” he added.

Over the course of the last two decades, Iran has emerged as a formidable military power with a large number of well-trained soldiers and a wide assortment of domestically manufactured weapons at its disposal.Sputnik International, 7d ago

Latest

Aidan Thorn, business development manager of marine robotics at Sonardyne, provided an update on what it has meant to see multiple unmanned underwater vehicles (UUVs) launched from the E/V Nautilus. The Ocean Exploration Cooperative Institute (OECI) hosted the Multivehicle Exploration expedition with the concept of operating three platforms together, one USV and two AUVs. The University of New Hampshire's USV DriX, the University of Rhode Island’s Deep Autonomous Profile (DAP), and Woods Hole Oceanographic Institution’s Mesobot were the coordinated underwater tools on the 18-day mission. DriX, DAP, and Mesobot were able to explore the entire water column, beginning on the seafloor around the Geologists Seamounts south of the Main Hawaiian Islands. Deploying a single UUV from a ship can be time-consuming and expensive, and it limits the amount of data that can be collected. Even where multiple UUVs can be accommodated, typically only one is deployed at a time due to the complexity. As unmanned surface vehicles (USVs) become more advanced, they can be used as "remote shepherds" to control UUVs. This means UUV operations can be conducted without a mothership, opening new possibilities for multiple autonomous tools. “The Sonardyne ROS driver was a key component used with Project 11 on DriX, which provided situational awareness, command, and control during the project,” Thorn explained. “Many different institutions gathered on board and witnessed both vehicles working together and communicating mapping and camera data via DriX's acoustic relay. In real time they could see the vehicles speak the same language.” DriX used a broadband link to communicate with the ship and a Sonardyne USBL solution to communicate with and provide navigation data to the AUVs. Fugro's remote operations capabilities are another example of this technology: it has fitted its USVs with Sonardyne USBL, allowing them to position UUVs for missions. 
Most recently, Fugro fitted its UUV with Sonardyne's Sprint-Nav Mini to support MBES surveys. This has opened up numerous opportunities. “The Sprint-Nav Mini is the smallest of its kind and can provide reliable surface and subsea vehicle guidance and navigation when you’re tight on space,” Thorn said. Collectively, the multivehicle explorations surveyed midwater ecosystems associated with seamounts and gained important new information on the spatial and temporal dynamics of these largely unstudied ecosystems. The knowledge and data collected are essential precursors for future exploration and discoveries, and contribute directly to the U.S. National Strategy for Ocean Mapping, Exploration, and Characterization, and to Seabed 2030. All these insights highlight how USV/UUV collaboration can be deployed to deliver valuable data in a cost-effective, low-carbon and safe manner for multiple scientific and commercial purposes.workboat.com, 3d ago
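The acoustic-relay navigation described here can be illustrated with the basic geometry of USBL positioning: the transceiver measures two-way travel time for range, and the signal's arrival angles for bearing. A minimal sketch, assuming a nominal 1500 m/s sound speed and hypothetical function names (generic textbook geometry, not Sonardyne's implementation):

```python
import math

SOUND_SPEED = 1500.0  # m/s, nominal seawater value (an assumption)

def slant_range(two_way_travel_time_s):
    # One-way distance from half the round-trip acoustic travel time.
    return SOUND_SPEED * two_way_travel_time_s / 2.0

def relative_position(two_way_travel_time_s, bearing_deg, depression_deg):
    """Convert range and arrival angles to (east, north, down) offsets
    of the subsea vehicle relative to the surface transceiver."""
    r = slant_range(two_way_travel_time_s)
    b = math.radians(bearing_deg)
    d = math.radians(depression_deg)
    horizontal = r * math.cos(d)
    return (horizontal * math.sin(b),   # east
            horizontal * math.cos(b),   # north
            r * math.sin(d))            # down
```

Real systems additionally correct for the sound-speed profile, vessel attitude and transducer geometry before georeferencing the fix.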
It's really not old enough to be called ‘historical’, but Stavenga et al.’s 2014 paper ‘Coloration principles of nymphaline butterflies – thin films, melanin, ommochromes and wing scale stacking’ (doi.org/10.1242/jeb.098673) is a JEB article that influenced my dissertation work very much. Everyone loves the morphologically elaborate nanostructures, and most of the papers I found as an early grad student focused on these, which didn't help me understand my own blue butterflies. This study did such a great job of thoroughly sorting out how color works in the morphologically simplest butterfly scales, explaining the contributions of pigments and both surfaces of the scale, and integrating several different types of measurements with theoretical modeling. I referred to it constantly.The Company of Biologists, 3d ago
Introducing AI Two, the ultimate masterpiece designed to unleash your boundless creativity. AI Two is not just an ordinary design tool - it's a game-changing all-in-one platform that harnesses the power of artificial intelligence to redefine how you imagine, create, and bring your wildest ideas to life. With AI Two, you have the power to embark on a journey of limitless possibilities. Whether you are a tattoo artist looking to push the boundaries of inked art, a fashion designer yearning to weave together fabric and imagination, or an interior designer craving to transform spaces into havens of beauty - this revolutionary tool is here to empower you. Through the fusion of cutting-edge AI technology and intuitive design features, AI Two becomes your digital companion, always ready to nurture and enhance your artistic flair. With AI Two, you can now seamlessly transition from conception to creation, allowing yourself to focus solely on what truly matters: creativity. The possibilities within AI Two are as limitless as your imagination. Ignite your artistic genius by creating stunning tattoos that transcend the boundaries of ordinary ink. With an extensive library of designs and AI-powered suggestions tailored to your unique style, AI Two will elevate your craft to new heights. Discover inspiration you didn't know existed as AI Two analyzes your preferences and seamlessly merges them with its own creative prowess. Fashion mavens, get ready to turn heads with your visionary designs. AI Two empowers you to create captivating fashion collections that will leave the industry in awe. With its powerful algorithm and predictive capabilities, this design tool becomes your very own fashion muse. From fabric selections to pattern harmonies, AI Two offers suggestions that are as innovative as they are trendsetting, giving you the confidence to set new fashion standards. Calling all interior designers! With its precise measurements, diligent AI assistance, and realistic 3D renderings, this platform will help you visualize dream designs with unparalleled accuracy. 
With AI Two, you will be able to confidently present clients with visually stunning designs that exceed their expectations. When it comes to limitless creativity, AI Two knows no boundaries. With its unbeatable combination of artificial intelligence and user-friendly features, this design tool will revolutionize the way you create and inspire. Get ready to unlock the full potential of your imagination with AI Two, your professional companion on the journey to artistic excellence. Experience the future of design. Unleash your creativity with AI Two.saasworthy.com, 3d ago
A quarter of a century later that summary still applies. Which is not to suggest, however, that science has made no progress. Over the decades observers have gathered ever more convincing evidence of dark energy's existence, and this effort continues to drive a significant part of observational cosmology while inspiring ever more ingenious methods to, if not detect, at least define it. But right from the start—in the first months of 1998—theorists recognized that dark energy presents an existential problem of more immediate urgency than the fate of the universe: the future of physics.Scientific American, 3d ago
Benjamin Franklin’s famous quotation – “Early to bed and early to rise makes a man healthy, wealthy, and wise” – underscores the importance of sleep-wake patterns to our health and well-being. Of all the phenomena that are influenced by the 24 hour light-dark cycle, the sleep-wake cycle is the best known, but many other biological processes central to health are also influenced. However, empirical evidence about the impact that disruptions to sleep-wake patterns have on human health remains sparse and is mostly limited to studies on shift workers, despite an increase in other forms of disruption (notably higher levels of artificial light at night and increased use of mobile media in bed).eLife, 3d ago


Newswise — The latest image from NASA’s James Webb Space Telescope shows a portion of the dense center of our galaxy in unprecedented detail, including never-before-seen features astronomers have yet to explain. The star-forming region, named Sagittarius C (Sgr C), is about 300 light-years from the Milky Way’s central supermassive black hole, Sagittarius A*. “There's never been any infrared data on this region with the level of resolution and sensitivity we get with Webb, so we are seeing lots of features here for the first time,” said the observation team’s principal investigator Samuel Crowe, an undergraduate student at the University of Virginia in Charlottesville. “Webb reveals an incredible amount of detail, allowing us to study star formation in this sort of environment in a way that wasn’t possible previously.” “The galactic center is the most extreme environment in our Milky Way galaxy, where current theories of star formation can be put to their most rigorous test,” added Professor Jonathan Tan, one of Crowe’s advisors at the University of Virginia. Amid the estimated 500,000 stars in the image is a cluster of protostars – stars that are still forming and gaining mass – producing outflows that glow like a bonfire in the midst of an infrared-dark cloud. At the heart of this young cluster is a previously known, massive protostar over 30 times the mass of our Sun. The cloud the protostars are emerging from is so dense that the light from stars behind it cannot reach Webb, making it appear less crowded when in fact it is one of the most densely packed areas of the image. Smaller infrared-dark clouds dot the image, looking like holes in the starfield. That’s where future stars are forming. Webb’s NIRCam (Near-Infrared Camera) instrument also captured large-scale emission from ionized hydrogen surrounding the lower side of the dark cloud, shown cyan-colored in the image. 
Typically, Crowe says, this is the result of energetic photons being emitted by young massive stars, but the vast extent of the region shown by Webb is something of a surprise that bears further investigation. Another feature of the region that Crowe plans to examine further is the needle-like structures in the ionized hydrogen, which appear oriented chaotically in many directions. “The galactic center is a crowded, tumultuous place. There are turbulent, magnetized gas clouds that are forming stars, which then impact the surrounding gas with their outflowing winds, jets, and radiation,” said Rubén Fedriani, a co-investigator of the project at the Instituto Astrofísica de Andalucía in Spain. “Webb has provided us with a ton of data on this extreme environment, and we are just starting to dig into it.” Around 25,000 light-years from Earth, the galactic center is close enough to study individual stars with the Webb telescope, allowing astronomers to gather unprecedented information on how stars form, and how this process may depend on the cosmic environment, especially compared to other regions of the galaxy. For example, are more massive stars formed in the center of the Milky Way, as opposed to the edges of its spiral arms? “The image from Webb is stunning, and the science we will get from it is even better,” Crowe said. “Massive stars are factories that produce heavy elements in their nuclear cores, so understanding them better is like learning the origin story of much of the universe.”...newswise.com, 13d ago
Researchers have developed a solid-state electrocaloric cooling device that can generate a 20 kelvin temperature difference with high efficiency, according to a new study. The findings show that electrocaloric cooling can compete with other solid-state cooling strategies and offer a promising alternative to environmentally unfriendly vapor compression cooling. Cooling devices, including air-conditioning and heat pump systems, are estimated to consume roughly 20% of global electricity. Most of these systems operate through vapor-compression technologies, which are relatively inefficient and require environmentally harmful fluorinated refrigerants. Cooling through solid-state electrocaloric materials is an attractive alternative for vapor compression cooling. Electrocaloric technologies are based on ferroelectric materials exposed to an electric field, which triggers changes in the material’s polarization, altering its temperature. This effect could form the basis of highly efficient cooling technologies. However, electrocaloric devices that are commercially competitive have not yet been achieved and, despite clear potential advantages, lag behind most other solid-state cooling technologies. Now, Junning Li and colleagues present a fluid-based double-loop electrocaloric heat pump that can generate a maximum temperature span of 20.9 kelvin (K) and a maximum cooling power of 4.2 watts. According to the authors, the temperature span and cooling power are, respectively, 50% and 15 times larger than those of the previous best electrocaloric device. Moreover, Li et al. show that their device reaches 64% of Carnot’s efficiency (the theoretical maximum efficiency of a cooling system), which exceeds many vapor-compression and caloric cooling devices. 
“Although the performance of this system may fall short of the requirements of many practical applications, which often require cooling capacities of at least several hundred watts at temperature spans exceeding 20 K, the work of Li et al. underscores the immense future potential of electrocaloric technology,” writes Jaka Tušek in a related Perspective.SCIENMAG: Latest Science and Health News, 17d ago
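The reported 64%-of-Carnot figure can be sanity-checked against the Carnot limit for cooling across a temperature span, COP_Carnot = T_cold / (T_hot - T_cold). A minimal sketch, where the 279.1 K and 300.0 K reservoir temperatures are illustrative assumptions chosen to match the reported 20.9 K span, not figures from the study:

```python
def carnot_cop(t_cold_k, t_hot_k):
    # Theoretical maximum coefficient of performance for a cooler
    # working between two reservoirs (temperatures in kelvin).
    return t_cold_k / (t_hot_k - t_cold_k)

# Illustrative reservoirs separated by the reported 20.9 K span.
cop_limit = carnot_cop(279.1, 300.0)   # ≈ 13.35
cop_device = 0.64 * cop_limit          # at 64% of Carnot ≈ 8.5
```

Even at a fraction of the Carnot limit, such a COP compares favorably with typical vapor-compression systems operating over similar spans.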
All these dreamers and ambitious high-flyers are Emirates Flight Training Academy (EFTA) cadets, who form the new generation of pilots critical to the future of aviation. EFTA, which was started with a focus on the national cadet pilot programme for Emirates, now offers its world-class training to cadets from all over the globe. Besides the staggering success of its graduates – who’ve all been snapped up by Emirates after a stringent recruitment process – the academy is flying high on the integration of three brand-new Diamond aircraft into its training programme. Although the majority are students who’ve recently graduated from high school, the cadets range in age from 17 to 26 years. In fact, a member of Emirates cabin crew is now a cadet at EFTA, having met the academy’s strict eligibility criteria. Captain Abdulla Al Hammadi, Divisional Vice President Emirates Flight Training Academy, said: "EFTA is fully focused on establishing a reliable stream of pilots for the aviation industry, which faces a scarcity of pilots in the short to long term, and on giving aspiring cadets a visionary academy based in an iconic city that is among the safest in the world and one of the largest aviation hubs on the planet. We offer one of the most sophisticated cadet training programmes, delivered by our 50+ highly skilled instructors, and backed by the exceptional standards and governance set by Emirates, the world’s largest international airline. It’s inspiring to see the camaraderie and collaboration among our cadets as they graduate real-world ready with the highest levels of skills and competencies required by commercial pilots." EFTA’s cadets now also fly the Diamond DA42-VI light piston twin-engine aircraft, which has successfully introduced multi-engine piston training at the academy. This means cadets train on three different aircraft types – single-engine piston, twin-engine piston and light jets. 
This is a rarity among flight training academies that generally train cadets on just one or two aircraft types. At the Dubai Airshow, EFTA will be showcasing its Diamond DA42-VI and Cirrus SR22 aircraft with instructors on hand to share information about the Academy’s fleet and cadet training programme. Flight deck careers have seen a huge resurgence post pandemic and are riding another wave of popularity. Salaries have increased, airlines are fast tracking career paths and expanding at a phenomenal rate, aircraft are more sophisticated, demand for travel is booming – all these and more have made aviation a sought-after career. According to Oliver Wyman’s latest research, the gap between supply and demand of pilots is currently about 17,000 and it will increase to 24,000 in 2026. Emirates launched EFTA in 2017 to train UAE nationals and international students with no previous knowledge of flying. All graduates have the unique opportunity to go through Emirates’ recruitment process. The Academy combines cutting-edge learning technologies and a modern fleet of 29 training aircraft – 22 x Cirrus SR22 G6 single-engine piston, 4 x Embraer Phenom 100EV very light jet, and 3 x Diamond DA42-VI light piston twin-engine aircraft. EFTA’s state-of-the-art facility, which is spread over 12.5 million sq.ft. (1.2 million sq.m) equivalent to 200 football pitches, currently has 36 modern classrooms, 6 full motion flight simulators, an independent Air Traffic Control Tower, and a dedicated 1,800m runway. Cadets live on campus in fully furnished, individual studio flats with an enviable range of recreational facilities, social activities and top-notch dining. Ground school (52 weeks): All 36 classrooms are equipped with two 86" touchscreens, running bespoke training software created specifically for EFTA. 
Cadets undergo at least 1,100 hours of ground-based training, and access the material via their own devices, which are digitally interconnected, creating an interactive training environment. Flying Phase (52 weeks, 272 hours): Cadets train on simulators and the three types of training aircraft. Here's what two EFTA graduates, currently Emirates’ Cadet Pilots, had to say: Thomas Saunders, Cadet Pilot at Emirates, Australian national: "From a very young age, I’ve been fascinated by how these big machines could stay in the sky, and that has led me down the most amazing career path. Emirates Flight Training Academy is a great route to work at the best airline in the world. I also felt that an airline that has such sky-high standards and is so quality conscious will quite naturally impart the best training and mentorship in aviation. My instincts were proved right. "Our cadet community at EFTA is very tight knit and collaborative. The training gets the best out of us and sets us on our way to a long, enjoyable and successful career. The facilities and infrastructure are absolutely stunning, and everything you need is right at your doorstep. "My first flight was the most surreal feeling. It was just me at the controls, nature at its best, the incredible Dubai landscape below me – it’s something I had only dreamed of. I advise potential cadets considering a career in aviation to keep at it, enjoy the turbulence, and twists and turns as the experience builds character, and the outcome is way better than you can ever imagine!" Jordan Engeler – Cadet Pilot at Emirates – Australian national: I’ve always had a deep fascination for the aviation industry and the sense of freedom it offers. The idea of being in control of a sophisticated aircraft, navigating the skies, and experiencing the world from a unique perspective was incredibly appealing to me. 
World class training facilities, experienced and supportive instructors, and a diverse and inclusive community were the top three things I loved about EFTA. Emirates is renowned for its exceptional commitment to safety, innovation, and excellence in the aviation industry. EFTA shares these values and offers a world-class training programme that aligned perfectly with my aspirations to become a highly skilled pilot. EFTA’s state-of-the-art facilities, modern fleet of aircraft, and experienced instructors made it an ideal environment for me to learn and thrive. The feeling of my first solo flight was extremely exhilarating. As I accelerated down the runway, there was a mix of nervousness and excitement, but as the wheels left the ground, it was like a dream come true. I could feel the aircraft responding to my inputs, and the realisation that I was solely responsible for this flight was both empowering and humbling.freightweek.org, 23d ago
Newswise — In a new study published in The Journal of Finance and Data Science, a researcher from the International School of Business at HAN University of Applied Sciences in the Netherlands introduced the topological tail dependence theory—a new methodology for predicting stock market volatility in times of turbulence. "The research bridges the gap between the abstract field of topology and the practical world of finance. What's truly exciting is that this merger has provided us with a powerful tool to better understand and predict stock market behavior during turbulent times,” said Hugo Gobato Souto, sole author of the study. Through empirical tests, Souto demonstrated that the incorporation of persistent homology (PH) information significantly enhances the accuracy of non-linear and neural network models in forecasting stock market volatility during turbulent periods. “These findings signal a significant shift in the world of financial forecasting, offering more reliable tools for investors, financial institutions and economists,” added Souto. Notably, the approach sidesteps the barrier of dimensionality, making it particularly useful for detecting complex correlations and nonlinear patterns that often elude conventional methods. “It was fascinating to observe the consistent improvements in forecasting accuracy, particularly during the 2020 crisis,” said Souto. The findings are not confined to one specific type of model: the improvements span various models, from linear to non-linear, and even advanced neural network models. These findings open the door to improved financial forecasting across the board. “The findings confirm the theory's validity and encourage the scientific community to delve deeper into this exciting new intersection of mathematics and finance,” concluded Souto.newswise.com, 12d ago
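For a flavor of how persistent homology information can be extracted from market data, here is a generic sketch of a standard pipeline (not necessarily Souto's exact methodology): pairwise correlations between assets are mapped to distances, and the degree-0 persistence of the resulting metric space, which is recoverable from a minimum spanning tree, summarizes how tightly clustered the market is.

```python
import math
from itertools import combinations

def correlation_distance(rho):
    # Standard mapping from a correlation to a metric distance.
    return math.sqrt(2.0 * (1.0 - rho))

def zero_dim_persistence(dist):
    """Death times of the degree-0 bars of the Vietoris-Rips filtration.

    Every point is born at 0; components merge exactly at the
    minimum-spanning-tree edge weights, so the deaths are the MST edge
    lengths (Kruskal's algorithm with union-find).
    """
    n = len(dist)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    edges = sorted((dist[i][j], i, j) for i, j in combinations(range(n), 2))
    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(w)  # two components merge: one bar dies at w
    return deaths  # n - 1 finite bars

def total_persistence(corr):
    # Summary statistic: total length of the degree-0 bars.
    dist = [[correlation_distance(c) for c in row] for row in corr]
    return sum(zero_dim_persistence(dist))
```

When correlations spike in a crisis, distances shrink and total persistence drops, so such summaries track market turbulence and can be fed as features into volatility models.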
It is exciting to learn that physicist Prof Gopal Dixit of the Indian Institute of Technology Bombay has been actively contributing to the field of attosecond physics. “I was sure that sooner or later the field of Attosecond Physics would be awarded a Nobel Prize. It is such a fundamental topic in nature as well as has a huge potential to make a big leap in upcoming quantum technologies,” he says. Prof Dixit and his team are working on several theoretical aspects and applications of attosecond physics. They have been helping solve some of the problems that the Nobel laureates, their teams and other researchers aim to solve.iitb.ac.in, 25d ago
The astrophysicist Martin Rees has speculated that somewhere in the universe ‘there could be life and intelligence out there in forms we can’t conceive. Just as a chimpanzee can’t understand quantum theory, it could be there are aspects of reality that are beyond the capacity of our brains.’ But that cannot be so. For if the ‘capacity’ in question is mere computational speed and amount of memory, then we can understand the aspects in question with the help of computers – just as we have understood the world for centuries with the help of pencil and paper. As Einstein remarked, ‘My pencil and I are more clever than I.’...Zephyrnet, 28d ago


Next, we tested fall migratory monarchs in a modified righting assay apparatus that was constructed to allow manipulation of the magnetic field that monarchs were exposed to during the trial (Fig. 1C). The righting response assay apparatus (dimensions, L×W×H: 41×31×21 cm) was nested within a Helmholtz coil system (two coils) that allowed us to generate artificial magnetic fields with different field parameter values for inclination angle and/or total field intensity (see Guerra et al., 2014). The horizontal coil allowed us to produce a field that aligned magnetic North and South with the monarch's axis of rotation and its testing position within the apparatus. Therefore, we made magnetic south (mS) on the left side and magnetic north (mN) on the right side of the monarch during trials (Fig. 1C, center). The vertical coil allowed us to manipulate the vertical component of the magnetic field such that we could alter the inclination angle of the field. Trial magnetic conditions were measured and calibrated using an Applied Physics Systems tri-axial fluxgate magnetometer (model 520A; Mountain View, CA, USA) at the head position of the butterfly during trials. An opening above the apparatus allowed the illumination of the trial by our full-spectrum light source (which provided the necessary light input to activate the monarch magnetic sense; see above), with light running through a diffuser to provide diffuse, non-directional light (Guerra et al., 2014) (Fig. 1C, right).The Company of Biologists, 3d ago
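For context on the coil arrangement, the on-axis field at the midpoint of an ideal Helmholtz pair (coil separation equal to coil radius R) follows the textbook formula B = (4/5)^(3/2) μ0 n I / R, which is what makes such systems convenient for generating uniform, precisely adjustable fields. A minimal sketch with illustrative coil parameters (not the values used in the study):

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, T·m/A

def helmholtz_field(n_turns, current_a, radius_m):
    # On-axis field at the center of an ideal Helmholtz pair
    # (both coils included; n_turns is the turns per coil).
    return (4.0 / 5.0) ** 1.5 * MU_0 * n_turns * current_a / radius_m

# Illustrative parameters: 100 turns per coil, 1 A, 10 cm coil radius.
b_center = helmholtz_field(100, 1.0, 0.1)  # ≈ 9.0e-4 T (≈ 9 gauss)
```

Varying the current in the vertical coil changes the vertical field component, and hence the inclination angle, which is the manipulation the assay relies on.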
Currently, we understand elements of the function of individual organs and systems, but one of the biggest challenges is understanding the integration across systems and organizational scales: linking laboratory-based studies of form, function and biomechanics to how animals behave in much more complex real-world environments. For example, how does muscle–tendon stiffness determine economy of movement, which then determines how an animal evaluates risks and rewards in its environment and then decides to navigate that environment. We have a big gap in our understanding at the organism/environment scale. There are people who study animal behaviour in their environment and the environmental factors that influence behaviour, but we don't have many approaches that directly link, in a mechanistic way, how the physiological systems in the body and the biomechanics of those systems influence the capacity to move in the environment and how animals choose to move depending on their own individual capacity. I think there's a rich opportunity to integrate between comparative biomechanics and animal behaviour, and between comparative biomechanics and wildlife ecology. New sensor and data logging technologies are now allowing scientists to do this, but this integration remains in the early stages. We now have miniaturized accelerometers with relatively long battery lives that can track animal movements, but we need physiological measures as well – measurements of heart rate, heat production and other physiological variables as animals move. Data storage and battery life are often limiting factors. It's really exciting to see how far the technology has come in the past 10 years, so I'm hopeful that we will get to the point where we have rich datasets from animals moving in natural environments that will inform our laboratory studies. 
We need to know what behaviours are most relevant to animals in the real world so we can investigate them in more mechanistic detail in the lab.The Company of Biologists, 3d ago
I studied electronic engineering for 7 years at a college of technology (‘Kosen’, a unique Japanese institution of higher education). During my studies at Kobe City College of Technology, I conducted research on the technology for simulating acoustic space and how humans perceive three-dimensional sounds with Associate Professor Yoshiki Nagatani. While there, I learned about the ability of bats to understand space through ultrasonic echoes, and I developed a deep interest in this field. Therefore, I decided to continue my research as a graduate student under Professor Shizuko Hiryu at Doshisha University, who is an expert in bat echolocation in Japan. This was a complete change from the engineering research I had done before, such as numerical calculations, programming and circuit design. Instead, I was able to spend much more time interacting with animals. As I was observing bats for experiments, I was also fascinated by their communication. Within their communication, the bats seemed to show different expressions such as anger, fear and, sometimes, a lack of motivation. Based on these observations, I'm focusing on their vocal communication and reactions for my PhD research with Shizuko and Professor Kohta I. Kobayasi. I'm also currently doing research as part of a fellowship at the Japan Society for the Promotion of Science. I would like to study how the emotions and individuality of bats affect their vocal communication.The Company of Biologists, 3d ago

Ribosomes translating membrane proteins are targeted to this translocon complex at the endoplasmic reticulum (ER) membrane, where the nascent transmembrane domains (TMDs) partition into the lipid bilayer and establish their native topology relative to the membrane (Stage I) (28, 30, 41). After translation is complete, membrane proteins fold and assemble (Stage II) in a manner that is coupled to their passage through the secretory pathway to the plasma membrane and/or other destination organelles (17, 30, 44). Though they are governed by distinct physicochemical constraints, failures in either Stage I or Stage II folding are capable of increasing the proportion of nascent membrane proteins that are retained in the ER and prematurely degraded (17, 44). We therefore expect that epistatic interactions can arise from the cumulative energetic effects of mutations on either of these processes. Given that three-dimensional protein structures are stabilized by similar energetic principles in water and lipid bilayers, mutations that modify the fidelity of Stage II folding energetics should potentially generate long-range epistatic interactions that are comparable to those observed in soluble proteins (17). By comparison, the topological transitions that happen during the early stages of membrane protein synthesis may be coupled to neighboring loops and helices in a manner that can play a decisive role in the formation of the native topology (13, 25, 43). Based on these considerations, we reason that the effects of mutations on each process could create distinct epistatic interactions.elifesciences.org, 3d ago
To understand what 'misinformation' should be, it is important to analyse it in relation to 'information'. In the literature of information theory and the philosophy of information, the relationship between falsity and informativeness and the dichotomy between information and misinformation remain contentious topics. Related questions include whether misinformation is informative, and whether it should be classified as information at all. In cybernetics and the mathematical theory of information, the informativeness of a message is measured by entropy, or its ability to reduce uncertainty (Shannon, 1948). Within this context, misinformation is regarded as noise or an error that occurs during transmission, and as such, it lacks informative value. In the realm of philosophy, misinformation is also often disregarded as a valid form of information. For instance, in The Philosophy of Information, Floridi (2013) reasons that information is meaningful, truthful and provides value in making decisions, in which case misinformation does not qualify as information at all. This view echoes the theories of information of the philosophers Dretske (1983) and Grice (1989), both of which also propose that information needs to encapsulate truthfulness and informativeness, and that therefore misinformation cannot be considered a form of information.Internet Policy Review, 3d ago
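Shannon's entropy measure mentioned above is easy to make concrete. A minimal sketch (the function name and example probabilities are ours, not Shannon's notation): entropy quantifies, in bits, how much uncertainty a message source resolves on average, which is why a near-certain (or purely noisy) source carries little informative value on this account.

```python
import math

def shannon_entropy(probs):
    """Shannon (1948) entropy H = -sum(p * log2 p), in bits:
    the average uncertainty a message from this source resolves."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss resolves maximal uncertainty: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased source resolves almost none, so each symbol is
# barely informative.
print(shannon_entropy([0.99, 0.01]))
```

On this quantitative view, a transmission error that scrambles symbols only adds noise; it cannot increase the receiver's information, which is the sense in which misinformation "lacks informative value" in the cybernetic tradition.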
Abstract: Controlling magnetism with voltage has an enormous potential to boost energy efficiency in nanoscale magnetoelectric devices since the use of electric fields (instead of magnetic fields or electric currents) minimizes Joule heating effects and reduces the overall device power consumption. In recent years, we have demonstrated the possibility to induce reversible, non-volatile changes in the magnetic properties (coercivity, remanent magnetization and saturation magnetization) of nanoporous films consisting of metal alloys (e.g., CuNi, FeCu) or oxides (e.g., FeOx, CoFe2O4), by applying an electric field through a liquid electrolyte gate at room temperature [1,2]. In addition, we have made significant progress in the field of magneto-ionics (i.e., voltage-driven ion transport in magnetic materials), which has traditionally relied on controlled migration of oxygen or lithium ions. Here, I will show that voltage-driven transport of nitrogen ions can be also triggered at room temperature in transition metal nitride (CoN, FeN, CoMnN and CoFeN) films via liquid electrolyte gating [3,4]. Nitrogen magneto-ionics can induce reversible ON-OFF transitions of ferromagnetic states at faster rates and lower threshold voltages than oxygen magneto-ionics. This is due to the lower activation energy needed for ion diffusion and the lower electronegativity of nitrogen with cobalt, compared with oxygen. Remarkably, and in contrast to oxygen magneto-ionics, nitrogen transport occurs uniformly through a plane-wave-like migration front, without the assistance of diffusion channels, which is particularly interesting for the implementation of multi-stack memory devices. Furthermore, we will show that both oxygen and nitrogen magneto-ionics can be used to emulate some important neuromorphic/synaptic functionalities (spike amplitude-dependent plasticity, spike duration-dependent plasticity, long term potentiation/depression). 
By tuning ion cumulative effects of DC and pulsed voltage actuation (at frequencies in the range 1 – 100 Hz), learning, memory retention, forgetting and self-learning by maturity (post-stimulated learning) can be mimicked. The latter can serve as a logical function for the device to decide between self-learning or forgetting emulation, at will, post-voltage input. This constitutes a novel approach to emulate some specific neural functionalities (e.g., learning under deep sleep), that are challenging to achieve using other classes of materials currently employed for neuromorphic computing applications.icn2.cat, 4d ago
Dear fellow kids, we need to stop being silly. Current cryptography depends on math, which depends on stacking provably correct mathematical postulates on top of each other. No hallucinations. Current cryptography can only be cracked by brute force unless our math is broken. On the other hand, Q* and its siblings spit out bullshit based on statistics of what was learned during training and fine tuning. Hallucinations are welcome sometimes. Now, if our system of math is broken to make it exploitable, then someone has to have published a related paper or code on it, or some other result(s) that can lead to the exploit, for the LLM to be trained on it and be able to regurgitate it. Or it can hallucinate the hack if it is prompted right. For this scenario to work, there has to be (1) a bug in our math system that nobody has noticed yet and reported, (2) Q* has to have magically hallucinated it despite never having encountered that train of thought before, and (3) the hallucinations somehow have to magically be mathematically provably correct. Magic everywhere. 🎩 🫡...Blind, 4d ago
But in theory, there’s a way to push past this quantum limit. In their study, the researchers also show that by manipulating, or “squeezing,” the states that contribute to quantum noise, the stability of an oscillator could be improved, even past its quantum limit.SciTechDaily, 4d ago
Alex de Vries, a researcher who has been studying the environmental footprint of cryptocurrencies like BTC, predicts that if this trend continues, there could be more than a 40% increase in water consumption. This increase is driven by the energy-intensive Proof of Work (PoW) mechanism that underlies BTC mining. PoW demands huge amounts of computational power, leading to the necessity for extensive cooling systems in data centers and power plants.Coinpaper, 4d ago

Support for research and development in wind, solar, and next-generation nuclear technologies signals that people in the U.S. favor moving to a cleaner energy future. The results also indicate a notable decline in concerns related to nuclear energy. As the country navigates its energy future, this new research by ecoAmerica emphasizes the importance of continued education and awareness around renewable energy and climate change.

The U.S. is pro-nuclear, and respondents want more investment in nuclear

When asked about energy choices to address climate disruption, most respondents support nuclear power because it reliably generates a lot of our electricity (71 percent), helps grow our economy while reducing pollution to our climate and health (71 percent), and keeps the U.S. competitive and energy independent (69 percent). They also want nuclear power plants to be kept running until lower-cost renewable energy becomes available (69 percent), are in favor of nuclear because it does not emit pollutants that harm our health or climate compared to alternatives (68 percent), and note that thousands of years of uranium and thorium are available to power nuclear plants for sustainable energy (60 percent). People in the U.S. across all age cohorts and political affiliations want more investment into improved nuclear designs, such as advanced molten salt reactors (MSRs). Support for nuclear research and development has risen, with 73 percent of the population now wanting more focus on developing nuclear energy technologies such as MSRs.

Concerns about nuclear continue to decline

ecoAmerica's polling focused on traditional mainstream criticisms of nuclear power that have dominated popular debates for many decades. It did not include scientists and nuclear experts, whose opinions may vary from popular opinion. Across the six years of the survey, concerns about nuclear energy have dropped in the U.S. 
Concerns about waste disposal dropped (currently 73%, down from 84% in 2018) and concerns about health and safety also declined to 73%, slowly trending toward the scientific and technical evidence. Concerns about nuclear power use related to security and weaponization are down to 68%. 61% of people surveyed (down from 74% in 2018) are still worried about nuclear power use causing overpopulation or overdevelopment, which might reduce critical natural habitat.

Support for nuclear grows around the world

For the first time in six years, ecoAmerica polled respondents in Ontario, Canada, where 58% of electricity production comes from nuclear energy, and Japan, which plans to maximize nuclear energy generation. In Ontario, respondents want more research and development in next-generation nuclear energy (64 percent); support traditional nuclear power plants (63 percent); want existing nuclear power plants to be kept running as long as they are cost-effective (72 percent); say nuclear power plants keep Ontario competitive and energy independent (76 percent); want nuclear because it does not emit pollutants that harm our health and climate compared to other alternatives (73 percent); and say nuclear power plants reliably generate a lot of electricity (77 percent). In Japan, where 72 percent of respondents are very concerned or somewhat concerned about climate change, nuclear power plants are favored because they reliably generate a lot of energy (55 percent). "We are already facing the consequences of a changing climate, geopolitical pressure on the fuel supply, and rising prices. At the same time, hundreds of millions of people worldwide do not have access to electricity. It is imprudent to ignore the role of nuclear technology as part of the solutions we already know how to use and deploy," said Frank Hiroshi Ling, Ph.D., chief scientist, Anthropocene Institute. 
"Populations and leaders are turning toward nuclear energy because it has an excellent track record and can provide dispatchability. Countries like France have long recognized that nuclear power is a carbon-free baseload energy source that provides supply stability — it is vital to solving climate disruption."

About Anthropocene Institute

Anthropocene Institute comprises scientists, engineers, communicators, marketers, thought leaders, and advocates — all pulling together toward a common goal: make the Earth abundant for all and sustainable for decades to come. For more information, visit www.anthropoceneinstitute.com.altenergymag.com, 4d ago
See also Tech Policy Press – New Study Suggests ChatGPT Vulnerability with Potential Privacy Implications -” What would happen if you asked OpenAI’s ChatGPT to repeat a word such as “poem” forever? A new preprint research paper reveals that this prompt could lead the chatbot to leak training data, including personally identifiable information and other material scraped from the web. The results, which have not been peer reviewed, raise questions about the safety and security of ChatGPT and other large language model (LLM) systems. “This research would appear to confirm once again why the ‘publicly available information’ approach to web scraping and training data is incredibly reductive and outdated,” Justin Sherman, founder of Global Cyber Strategies, a research and advisory firm, told Tech Policy Press. The researchers – a team from Google DeepMind, the University of Washington, Cornell, Carnegie Mellon, University of California Berkeley, and ETH Zurich – explored the phenomenon of “extractable memorization,” which is when an adversary extracts training data by querying a machine learning model (in this case, asking ChatGPT to repeat the word “poem” forever). With open source models that make their model weights and training data publicly available, training data extraction is easier. However, models like ChatGPT are “aligned” with human feedback, which is supposed to prevent the model from “regurgitating training data.”...bespacific.com, 4d ago
On slower time scales, one could use electronics that are fast enough to measure these fluctuations. On ultrafast time scales, this no longer works, which is why a new experimental approach had to be developed. It is based on an idea from the research group of Alfred Leitenstorfer, who is also a member of the Collaborative Research Centre. Employing laser technology, the researchers use pulse sequences or pulse pairs in order to obtain information about fluctuations. Initially, this measurement approach was developed to investigate quantum fluctuations, and has now been extended to fluctuations in magnetic systems. Takayuki Kurihara from the University of Tokyo played a key role in this development as the third cooperation partner. He was a member of the Leitenstorfer research group and the Zukunftskolleg at the University of Konstanz from 2018 to 2020.ScienceDaily, 4d ago
In addition to costs for building and deploying the wave power devices and connecting them to electrical grids, the sector will face permitting requirements for a field that has never been regulated. A 2020 report from PNNL found that marine energy devices have limited impact on sea life, which could help with the process. Offshore wind has faced opposition from people living near project sites who object to the visual impacts of the turbines. Given the low profile of wave power devices, that’s less of a problem.GeekWire, 4d ago
Secondly, the research contributes to the body of knowledge by informing multi-stakeholders of the usefulness of ICT in improving the livelihoods of marginalized groups. This has been disseminated through published academic articles and conference papers, which are open access, allowing the young generation to read and extract knowledge on how to use ICT to close the gap between the haves and the have-nots.

What are the key research methods and materials used in your doctoral research?

The main research methods included (i) qualitative and (ii) quantitative approaches. The qualitative method was used to explore information and non-numerical data, while the quantitative method was applied to collect statistical data. These two methods were infused in the DSR methodology, which adopted the five-stage DSR framework by Johannesson & Perjons (2014) to support the design, development, testing and evaluation process, carried out linearly or iteratively depending on the situation. The materials used in the doctoral research included: (i) documentary reviews, e.g., journal articles, conference papers, books, magazines, the internet, computers, scanners, pictures and photos; (ii) materials collected during the research period, e.g., audio, video, text and drawings; (iii) materials collected during evaluation, e.g., mobile phones and the APK.

Is there something else about your doctoral dissertation you would like to share in the press release?

Despite the presence of numerous ICTs, mobile technology, mainly mobile commerce, currently remains the platform that provides opportunities for micro-traders to compete with registered firms on a more equal basis. It helps close the digital divide between the haves and have-nots, which remains a major issue in developing countries. 
This is because mobile technology, mainly the mobile phone, is endowed with features that include: (i) ubiquity; (ii) personalization; (iii) accessibility and affordability; (iv) availability anytime, anywhere; (v) location-based services.

The doctoral dissertation of Joel Rumanyika, MSc, entitled Mobile Technology in the Informal Economy: Prototypes for Market Access and Product Promotion will be examined at the Faculty of Science, Forestry and Technology, online. The opponent will be Professor, Deputy Dean for Research Shaun Pather, University of the Western Cape, South Africa, and the custos will be Professor Matti Tedre, University of Eastern Finland. The language of the public defence is English.University of Eastern Finland, 4d ago
The linked note is something I "noticed" while going through different versions of this result in the literature. I think that this sort of mathematical work on neural networks is worthwhile and worth doing to a high standard but I have no reason to think that this particular work is of much consequence beyond filling in a gap in the literature. It's the kind of nonsense that someone who has done too much measure theory would think about.

Abstract. We describe a direct proof of yet another version of the result that a sequence of fully-connected neural networks converges to a Gaussian process in the infinite-width limit. The convergence in distribution that we establish is the weak convergence of probability measures on the non-separable, non-metrizable product space (R^{d'})^{R^d}, i.e. the space of functions from R^d to R^{d'} with the topology whose convergent sequences correspond to pointwise convergence. The result itself is already implied by a stronger such theorem due to Boris Hanin, but the direct proof of our weaker result can afford to replace the more technical parts of Hanin's proof that are needed to establish tightness with a shorter and more abstract measure-theoretic argument.alignmentforum.org, 4d ago
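The infinite-width phenomenon the abstract describes can be watched numerically. This is only a Monte Carlo sketch under standard 1/√width initialization scaling, not the construction in the note or in Hanin's proof: sampling many random one-hidden-layer ReLU networks at a fixed input, the excess kurtosis of the output distribution (zero for a Gaussian) shrinks as the hidden width grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(x, width, rng):
    """One-hidden-layer ReLU net with 1/sqrt(fan-in) weight scaling,
    so the output variance stays O(1) as width grows."""
    d = x.shape[0]
    W1 = rng.normal(size=(width, d)) / np.sqrt(d)
    b1 = rng.normal(size=width)
    W2 = rng.normal(size=width) / np.sqrt(width)
    return W2 @ np.maximum(W1 @ x + b1, 0.0)

x = np.ones(3)
for width in (2, 16, 512):
    samples = np.array([random_relu_net(x, width, rng) for _ in range(5000)])
    # Excess kurtosis is 0 for a Gaussian; it decays toward 0 with width.
    k = np.mean((samples - samples.mean()) ** 4) / samples.var() ** 2 - 3.0
    print(width, round(k, 2))
```

This only probes a single input, i.e. one finite-dimensional marginal; the note's point is precisely about upgrading such pointwise statements to convergence on the full product space of functions.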

Sonya Palafox was a freshman at North High School in Denver 25 years ago when she got a message kids don’t want to hear: come to the principal’s office. She had no way of knowing it at the time, but the call would represent a turning point in her life. In the office with a group of other students, Palafox met Dr. Norman Watt, a professor of psychology at the University of Denver (DU). Watt had conducted a “resiliency study” that identified children from low socioeconomic backgrounds who had scored in the top quartile of the reading portion of the Iowa Test of Basic Skills. Watt wanted to know why some students had achieved academically despite poverty and other barriers. He focused his investigations on students who got early education in the Head Start program, then moved on to the Denver Public Schools (DPS) system. He identified 31 of these students with traits and influences that made them resilient and decided that these “ambassadors,” as he called them, might be called upon to go back into Head Start sites and, in turn, help a new generation of young kids learn the reading and social skills that would be keys to building their resiliency in the face of adversity. The aim: break the stubborn cycle of poverty with a new cycle of support, strength and success.

In the vanguard of the Ambassadors program

Palafox was one of the program’s 31 original “Ambassadors for Literacy.” They went on to mentor more than 500 preschool-age children. In return for devoting time to their Head Start work, she and the other ambassadors received a powerful incentive. Dollars from the program would go into a college savings account to assist them if they decided to continue their education after high school. “We rewarded the students for being ambassadors and positive role models so that they could go on to higher education,” said Jini Puma, PhD, associate director of the Rocky Mountain Prevention Research Center (RMPRC) at the Colorado School of Public Health. 
Puma, a mentee of Watt as a student at DU, joined the Ambassadors for Literacy program in 2002. Watt’s original initiative was successful in spurring the young ambassadors to attend college, Puma said. Eighty-seven percent of those enrolled in the program went on to enroll in a four-year school. That compared with 22% of seniors graduating from DPS, she added.

Taking a successful idea forward

Puma will now direct a new phase of Watt’s original idea, dubbed “Ambassadors for Literacy and Resilience.” A nearly $1 million donation gives a considerable boost to the effort. It includes training ColoradoSPH students to mentor the new generation of ambassadors, just as the ambassadors guide early childhood students at Head Start centers. “It’s a three-prong approach” to positive mentoring that proved successful in Watt’s original conception, Puma said. The new phase of the program includes hiring a program director, Joanna Coleman, who is bilingual in English and Spanish and has previous teaching experience. Among other responsibilities, Coleman will help to make connections with the school counselors and teachers who spot students with promise to be ambassadors, Puma said. “Joanna is doing all of our community engagement and outreach, recruiting families and leading training efforts” for ambassadors in literacy and social-emotional skill development, Puma said. Coleman will also work with graduate student mentors, track ambassadors’ hours and handle other tasks needed to keep the program on track, she added. Coleman will also have help from Palafox, who has come full circle from that first meeting with Watt. She worked as an ambassador through high school and continued her involvement while earning her undergraduate degree from DU in international business. She didn’t find that field fulfilling and decided her career path was in education. 
She went on to receive a master’s degree in counseling from Regis University and now is counselor to some 200 students at the Denver Center for 21st Century Learning, not far from her high school alma mater. Palafox now serves as an advisor to the new Ambassadors for Literacy and Resiliency program. In that role, she is working to identify students from her school who are promising candidates to help Head Start students, as she once did. The initial goal is to recruit five students from the Denver area to serve as ambassadors, Puma said. Further on the horizon, Puma hopes to expand the program to Weld County and the San Luis Valley. “Ultimately we aim to recruit the majority of students from rural areas because there are far fewer resources there,” she said.

Long-term benefits of the Ambassadors program

Palafox admits that as a ninth grader, she “wasn’t quite sure what the [Ambassadors] program was.” But years after the initially puzzling call to the principal’s office, she is clear about the benefits of the initiative. “It establishes a connection between early positive experiences with education for both Head Start students and the ambassadors,” Palafox said. “For the kids, it connects them to someone positive in a way that carries through their later years in schools. For the ambassadors, it builds self-efficacy and self-confidence that they are contributing to others in a meaningful way.” Puma said the results of Watt’s foundational work in resiliency bear out Palafox’s insights. “The number one factor was [resilient students] had a mentor or a trusted, caring adult in their lives,” Puma said. “It could be a coach, a teacher, a neighbor, but someone who took a real interest in a child’s success and was stable and loving and secure. 
That finding has been foundational in [the Ambassadors] program.” The strengthening of those types of bonds also has broad benefits for society, Puma believes. “The Ambassadors program addresses one of the social determinants of health, namely education access and quality,” she said. “It takes a multi-generational approach in doing so and [it also] addresses health equity…We know that for every year a person goes further with their education, their health outcomes are better.” On a personal level, Palafox recalls the first days of her ambassador training as an early glimpse at the possibility of a new life. Carrying a book bag of materials she would use with the Head Start kids, she strolled around the leafy DU campus. She was the first in her family to have the experience and opportunity. “It was the first time someone had talked to me in a way that [going to college] was a possibility,” Palafox recalled. “It was the first time it became tangible – because I saw it.”...cuanschutz.edu, 4d ago
After years in academia spent developing a very theoretical but powerful algorithm to tackle the analysis of enormous scientific datasets, he was looking for a place where he could match theory with real-world, large-scale science. He spent his time as a Ph.D. student at the University of Massachusetts Amherst developing algorithms to process massive, dynamically changing graphs. (Graphs, also called networks, are used to model connections between items, such as friendships among people, papers written by authors, and highways connecting cities.) The challenge was datasets that were so large they couldn’t be stored on a computer, and the algorithm needed to be able to determine patterns and results from the changing nature of the graph.Datanami, 4d ago
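The constraint described here — edges stream past and cannot all be stored — can be illustrated with a classic small-memory primitive (a generic sketch, not the researcher's actual algorithms): union-find maintains the connectivity of a growing graph while keeping only one integer per vertex, never the edges themselves.

```python
class UnionFind:
    """Incremental connectivity for a stream of edge insertions.
    Memory is O(n) in the number of vertices, independent of edge count."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.components = n

    def find(self, u):
        # Path halving keeps trees shallow, giving near-constant amortized cost.
        while self.parent[u] != u:
            self.parent[u] = self.parent[self.parent[u]]
            u = self.parent[u]
        return u

    def union(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru != rv:
            self.parent[ru] = rv
            self.components -= 1

# Edges arrive one at a time, as in a graph stream, and are discarded
# immediately after being processed.
uf = UnionFind(5)
for u, v in [(0, 1), (1, 2), (3, 4)]:
    uf.union(u, v)
print(uf.components)  # 2
```

Handling edge deletions (the "dynamically changing" part) is far harder than this insert-only case, which is one reason dynamic graph streaming is its own research area.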
MEGHAN O'SULLIVAN, Director, Belfer Center “Henry Kissinger shaped, informed, and animated American foreign policy for decades. His influence was as significant outside government as it was inside, and his ability to impact events was as astonishing in his 100th year as it was in earlier decades, when he sat in both the White House and the State Department. I feel privileged to have gotten to know this extraordinary statesman, scholar, and strategist since we first met during the time of the Iraq war nearly twenty years ago. We at Harvard and in the world beyond will discuss and debate his consequential and sometimes controversial legacy for a long time to come, which is as it should be. Today, the world is facing an era of great power competition and wars in both Europe and the Middle East. This geopolitical landscape seems to beg for the best of Kissinger’s scholarship and practice. Few better understood the dynamics and dangers of great power politics than Kissinger. He used this knowledge to produce outcomes that changed the course of history, both in the U.S. recognition of China and in détente between the United States and the Soviet Union. In the Middle East, fifty years ago, Kissinger’s vigorous and creative diplomacy helped allow peace to emerge from violence. As we face current challenges that have more than echoes of the past, we will be able to draw on the lessons – both positive and negative – from Kissinger’s career and scholarship. Kissinger’s legacy can also be more personal in nature. For me, I will remember his endless intellectual curiosity and willingness to take on complex issues – such as artificial intelligence – late in life and take that memory as an exhortation to never shy from a steep learning curve.”...Belfer Center for Science and International Affairs, 4d ago

...“The FOCS 2013 paper changed the landscape of theoretical cryptography,” CIS Lab Senior Scientist Abhishek Jain said. “It gave the first strong evidence that program obfuscation – long thought to be an impossible goal – could be realized securely. This had two profound effects on the research community. First, a change in belief about the existence of obfuscation became a powerful catalyst that motivated cryptographers to revisit the foundations of program obfuscation and realize it from well-founded assumptions. After nearly a decade, this effort finally culminated in a recent breakthrough work.”...NTT Research -, 28d ago
Pricing consists of four different areas: analysis, decisioning, deployment and monitoring. Technology has changed each of those in different ways. With insurance pricing, analysis is understanding the risk with the likely cost of claims. It's also about understanding policyholder behavior on personal lines. Today, insurers worldwide have at their disposal a very rich, powerful toolkit of machine learning models that can very quickly, very easily produce highly predictive models. With that powerful prediction come issues with interpretability, because a lot of these models are quite hard to understand. That can be a big problem in insurance for two reasons. Firstly, unlike marketing or other functions in insurance or other industries, where it's okay to have an 80-20 model, if you misprice insurance business, you can lose a lot of money very quickly. When things change, as they did during the COVID pandemic, some insurers are sharp at noticing that. Others are slow. Some relied on clever machine learning models calibrated pre-COVID that weren't as good post-COVID. There's a wall of regulatory issues to be tackled. There's 50 flavors of U.S. regulation. There's now quite a bit of pricing regulation to adhere to. For that, not only understanding what your models are doing, but also being able to explain them is really important. There's been less change in deployment from a technological perspective. But given all this modeling, it's important to scenario test what you want to do, and what's the best thing for the business in underwriting, pricing or other actions in portfolio management. It's important to construct a calculation that predicts as accurately as possible what might happen. WTW developed proprietary machine learning models that are interpretable by design. 
We have patents pending on interpretable machine learning models that are just as predictive, but are transparent in that you can see which factors explain the risk and the behavior, explain very clearly what's going on, and let you manage the models much better as a result. Once you decide what to do with pricing structures, claims, underwriting rules and case triaging rules, that has to be deployed into the real world, into a policy administration system. Increasingly now, technology has enabled very complex things to be done at the point of sale. But perhaps most importantly, maybe helped by the adoption of cloud computing, many systems out there are much more interoperable, and play more nicely with APIs, allowing calls from one system to another, and a componentized approach. That enables the deployment of deep analytics, undiluted by errors, without the costly process, so you can now deploy much more quickly. You can respond very quickly to developments in the market. By analyzing whether a change actually matters, we can proactively identify if something needs attention and, if it does, automatically identify why. That allows managing models to happen more easily, but also supports wider portfolio management. This automation removes the dross from an expert's life, and enables them to do what they're good at, which is thinking and bringing insurance experience to bear. None of this is about replacing the expert. It's about empowering the expert. The insurers that will win are those that embrace technology developments and analytical tools, and use them effectively, understanding insurance and keeping their eye on the ball. Problems from not spotting inflation or models gone wrong come from inexperience in insurance management, and overreliance on models and approaches that were not fit for purpose. Empowering the experts so they can be experts is a big theme that we mustn't forget.Digital Insurance, 18d ago
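WTW's interpretable-by-design models are proprietary and not shown in the article; the classic interpretable baseline in insurance pricing is a generalized linear model. A minimal sketch with invented data (the rating factor, claim counts, and learning rate are all illustrative assumptions): a Poisson GLM for claim frequency fit by gradient ascent, whose exponentiated coefficients read off directly as multiplicative relativities rather than opaque weights.

```python
import numpy as np

# Hypothetical toy portfolio: one rating factor (a "young driver" flag)
# and observed claim counts per policy. All numbers are invented.
X = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [1.0, 1.0],
              [1.0, 1.0]])          # columns: intercept, young-driver flag
y = np.array([1.0, 0.0, 2.0, 3.0])  # claim counts per policy

# Poisson GLM with log link: E[y] = exp(X @ beta).
beta = np.zeros(2)
for _ in range(5000):
    mu = np.exp(X @ beta)
    beta += 0.05 * (X.T @ (y - mu)) / len(y)  # ascent on the Poisson score

# Unlike a black-box model, each exp(coefficient) is a relativity an
# underwriter can read directly: base rate, and the young-driver loading.
print(np.exp(beta))  # converges toward [0.5, 5.0] on this toy data
```

Here the fitted base claim rate is 0.5 per policy and flagged drivers claim at 5x that rate; that one-line readability is exactly the interpretability the interview contrasts with harder-to-explain machine learning models.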
This innovative design offers a sustainable, safe, and high-energy-density alternative to conventional lithium-ion batteries, addressing the limitations of material scarcity and safety concerns. Recently featured in Science Advances under the title "Next-generation magnesium-ion batteries: The quasi-solid-state approach to multivalent metal ion storage", the new Mg-ion battery has the potential to revolutionize the industry. "It is a game-changing development," said Professor Leung. In recent years, Mg-ion batteries have emerged as a potential solution in light of lithium-ion batteries' limitations. However, the road to developing efficient Mg-ion batteries has been fraught with challenges, including the need to overcome the narrow electrochemical window in aqueous or water-based systems, and the poor ionic conductivity in non-aqueous systems. Addressing these obstacles, Professor Leung's team developed a water-in-salt Mg-ion battery with an operating voltage above 2 V. Yet, it still lags behind non-aqueous counterparts due to the dominance of proton over Mg-ion storage in the cathode. "Hydrogen ions, or protons, are smaller and lighter compared to the metal ions. Because of their size, protons can easily get into the battery's cathode structure. However, this creates a problem because protons and Mg ions compete for space, which severely limits how much energy the battery can store and how long it can last," said Sarah Leong, a PhD student in Professor Leung's team and the study's first author. The tireless efforts of the team finally bore fruit, however, with the introduction of the quasi-solid-state magnesium-ion battery (QSMB), an innovative battery design that uses a polymer-enhanced electrolyte to control the competition between protons and metal ions. QSMB boasts an impressive voltage plateau at 2.4 V and an energy density of 264 W·h kg⁻¹, surpassing the performance of current Mg-ion batteries and almost matching the performance of Li-ion batteries. 
Professor Leung stressed: "Our quasi-solid-state magnesium-ion battery combines the best of both worlds, offering the high voltage of non-aqueous systems and the safety and cost-effectiveness of aqueous systems. It represents a major step forward in the development of high-performance magnesium-ion batteries." To put the QSMB to the ultimate test, the research team conducted extensive cycling tests, with astonishing results. Even under extreme conditions of subzero temperatures (-22°C), the QSMB retained an impressive 90% of its capacity after 900 cycles. The battery is also non-flammable and resistant to pressures of over 40 atmospheres. This level of durability and performance makes the QSMB a promising candidate for consumer electronics, even in colder climates. Dr Wending Pan, a Research Assistant Professor in Professor Leung's team, believes the QSMB technology has the potential to reshape the landscape of energy storage and power our world sustainably. He said: "The advanced electrolyte development strategy presented in our research holds potential beyond magnesium-ion batteries, extending to other multivalent metal ion batteries, such as zinc-ion and aluminium-ion batteries. We believe that this study will pave the way for the next generation of energy storage solutions that are not only efficient but also environmentally friendly." Link to the paper: https://www.science.org/doi/10.1126/sciadv.adh1181 SME Business Daily Media, 19d ago
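The two headline figures above are related by E = V × Q. Assuming the quoted energy density is taken at the 2.4 V plateau and refers to the same mass basis (an assumption on our part, not a statement from the paper), they imply a cathode-level specific capacity:

```python
# Back-of-the-envelope check of the reported QSMB figures (illustrative arithmetic only).
voltage_v = 2.4            # reported voltage plateau (V)
energy_wh_per_kg = 264.0   # reported energy density (Wh/kg)

# E [Wh/kg] = V [V] * Q [Ah/kg]  =>  Q = E / V
capacity_ah_per_kg = energy_wh_per_kg / voltage_v
capacity_mah_per_g = capacity_ah_per_kg  # 1 Ah/kg is the same as 1 mAh/g

print(f"Implied specific capacity: {capacity_mah_per_g:.0f} mAh/g")
```

Under those assumptions the numbers imply roughly 110 mAh per gram, a useful sanity check when comparing against Li-ion cathode capacities.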
The Higgs mechanism is a part of the Standard Model of particle physics that allows certain kinds of particles to have nonzero mass. In spite of its great importance, there is no rigorous proof that the Higgs mechanism can indeed generate mass in situations that are relevant for the Standard Model. In technical terms, this corresponds to the "coupling parameter" of the model being small, and the "gauge group" being non-Abelian (the most important cases are SU(2) and SU(3)). I will present the first rigorous proof in this direction, showing that SU(2) lattice Yang-Mills theory coupled to a Higgs field transforming in the fundamental representation of SU(2) has a mass gap at any value of the coupling parameter, provided that the interaction of the Higgs field with the gauge field is strong enough. No background is needed. Institute for Advanced Study, 18d ago
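For readers unfamiliar with the term, "mass gap" has a standard textbook formulation (this is the conventional definition, not text from the abstract): a lattice theory has mass gap m > 0 if truncated correlations of local observables decay exponentially,

```latex
\bigl|\langle f(x)\,g(y)\rangle - \langle f(x)\rangle\langle g(y)\rangle\bigr|
\;\le\; C_{f,g}\, e^{-m\,|x-y|},
```

uniformly in the lattice size; roughly speaking, m is the mass of the lightest excitation of the theory.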
Crucially, the team demonstrates that even in an apparently dark cavity, α-RuCl3 senses modifications of the electromagnetic environment and changes its magnetic state accordingly. This is a purely quantum mechanical effect, arising from the fact that within quantum theory the empty cavity (technically called the vacuum state) is never really empty. Instead, the light field fluctuates so that light particles pop in and out of existence which, in turn, affects the properties of the material.SciTechDaily, 26d ago
It's born: in your eyes, under your skin. It's the immersiverse. Reality vs. Metaverse: a challenge that pits a space of known experiences, where we usually live, against a new space of the unknown, full of possibilities. At Dexagon we want to open the gate to virtual life, revealing a new way of approaching the virtual world. It is an approach that involves all the senses, bringing you into a completely different experience: the immersiverse. A brief history: in 1992, Neal Stephenson coined the word "Metaverse" in his novel "Snow Crash". In 2003, Second Life started to spread the concept of a new universe in which to evolve an avatar. In 2013, High Fidelity arrived as a virtual reality platform for users to create, deploy, visit, and interact with virtual worlds along with other users. In 2023, real estate business opportunities finally start. As more of our lives are spent online, it's becoming harder and harder to distinguish "real" life from life lived digitally.
For this reason we are creating a new reality as the reality. A new experience: a metaverse created as an environment that is always online, with all platforms and worlds connected and open to all. Landing in a new world: a place where our digital and physical lives converge, creativity is limitless, and location-defying worlds bring people together. Easy to access: a fully immersive experience where your avatar is ready to live fluidly, with no limitations on actions. Hyper realistic: high-resolution, detailed rendering will create a realistic environment that multiplies perception and experience. A whole new strategy: developing the urban criteria, guidelines and masterplanning with the aim of generating differentiation and creating value. Urban real estate: the properties will be organized and distributed along the masterplan to create a real estate environment with prices and scales of investment that satisfy a wider range of buyers. The mechanism of urban development will generate revenue for investors and will be based on an off-plan sale strategy. The main idea is to create value through radial circles, where the center hosts the properties with higher prices and the periphery lower ones. In this way it attracts different types of buyers and investors and can activate a virtuous circle. The masterplan concept promises adaptive polycentric cities, stunning design, and the luxury of an amazing natural environment. Dexagon has been working for years to create something unique.
This gives rise to a high-level research activity, which has also involved the development of cutting-edge devices and solutions for using the metaverse to its fullest. Recent posts from the Dexagon blog: "Dexagon's DXC Token Now Listed on CoinMarketCap!" (by [email protected] | Nov 8, 2023 | News): in a significant stride towards broader recognition and accessibility, Dexagon announced that its DXC token has been officially listed on CoinMarketCap. "The Nexus of Innovation: Bitcoin Tower in Dexagon's Immersiverse" (by Dexagon | Oct 2, 2023 | News) and "Expanding Horizons: Bridging Realities with Dexagon's Diogene" (by Dexagon | Sep 28, 2023 | News): Dexagon's Diogene headset transcends the borders between reality and virtuality, creating an immersive and interactive world where sensations and perceptions intertwine.
This revolutionary innovation is more than a gateway to different realms; it's a symbiosis of immersive experiences combined with sophisticated brain-reading technology, redefining the limits of what is achievable in the integration of our senses and the digital world. CryptoLinks, 26d ago

Latest

A potential avenue for future research is to explore how incorporating shape features can improve the performance of a controller, particularly in terms of precise manipulation and generalization to new shapes. It may be worth investigating the use of visual inputs for training, which could address the limitations of current reinforcement learning controllers that rely on full-state information simulation. Finally, comparative studies with prior works could help contextualize the findings in the existing literature, and dexterous manipulation using open-source hardware warrants further investigation. MarkTechPost, 4d ago
In 1980, physicist and cosmologist Frank Tipler took things further in his paper "Extraterrestrial Intelligent Beings Do Not Exist," where he employed refined calculations and the Copernican Principle. Based on his calculations, Tipler determined that a civilization limited to a modest fraction of the speed of light (10%) could accomplish this within just 650,000 years, long before life and human civilization arose on Earth. The fact that no evidence of any such civilization existed (what Hart called "Fact A") meant, on this argument, that there were no ETCs and that humanity was alone in the universe. phys.org, 4d ago
The release last year of OpenAI’s flagship ChatGPT engine—which quickly became one of the most popular software launches of all time—seems to have crystallized divisions within the company. As Karen Hao and Charlie Warzel wrote in The Atlantic last week, the launch “sent OpenAI in polar-opposite directions, widening and worsening the already present ideological rifts.” (In a 2019 email to staff, Altman referred to “tribes” at the company.) A source told Hao and Warzel that, once it became obvious how quickly ChatGPT was growing, OpenAI “could no longer make a case for being an idealistic research lab,” because “there were customers looking to be served here and now.” Within a matter of months, OpenAI had two million users, including many Fortune 500 companies, and had signed an investment deal with Microsoft for thirteen billion dollars in funding. Just before he was ousted, Altman was said to be trying to raise funds to start a computer chip company that could supply OpenAI’s insatiable demand for computing power. Columbia Journalism Review, 4d ago
University relations offers programs such as the Thought Leader Award, granted to Prof. Dwight Stoll of Gustavus Adolphus College in Saint Peter, Minnesota, in 2016. The award funded a project to investigate the role of 2D-LC in biopharmaceutical analyses, involving contributions from several researchers in academia and industry;9 note the temporal correlation with the release of the ASM technology. A personal highlight this year was the same award going to Prof. Peter Neubauer at the Technical University Berlin.10 His goal is to orchestrate hardware and software, automation, mathematical models and AI to discover optimal conditions for process analytical technology (PAT) workflows. I am looking forward to watching this project unfold over the next few years. Technology Networks, 4d ago
The Tribal Government’s Ecosystem Conservation Office is dedicated to the conservation and stewardship of the vital marine resources upon which Unangan culture and subsistence depends. One ongoing aspect of their work is the disentanglement of laaqudan from marine debris. Laaqudan entangled in debris—such as packing bands and fishing net fragments—have been observed in the Pribilofs since the 1930s. Due to the islands’ location in the central Bering Sea and local current patterns, a great deal of marine debris accumulates both in the waters surrounding and on the Pribilof Islands. Animals may become entangled either near the islands during breeding season, or during their winter or spring migrations. Entanglement greatly reduces a fur seal’s chance of survival. It can restrict movement or cause injury as debris cuts into the skin over time, particularly for young animals that continue to grow after becoming entangled. NOAA, 4d ago
...claims aren't actually science itself. When your only method is to ask people what products they use and whether they feel sad, angry, or have a disease, and then correlate the product you wanted to target with the malady, it is easy to understand why disease epidemiologists had a difficult time getting traction during COVID-19: they had never stood up to the cranks at Harvard School of Public Health and the National Institute of Environmental Health Sciences using food surveys to try to scare people about everything. Epidemiology is the basis for a new exploratory paper which claims that siloxanes, which help to create shiny, smooth hair, are increased if you straighten or curl your hair, and that this might have health implications. Scared of this image? I can narrow the list of media corporations you read, I can home in on your food preferences, and certainly on how you vote. The experiment is fine for what it is. They set up a tiny house and had people style their hair while they measured cyclic volatile methyl siloxanes using mass spectrometry. Unsurprisingly, the levels increased. Higher temperatures caused higher amounts, as did more hair. So cut your hair short and don't curl it? Or... what? If I tell you your chances of winning the lottery are 1 in 235,000,000, but if you give me money they will drop to 1 in 234,000,000, you aren't giving me money. Yet if I claim that spending more for a product that does not contain something will increase your chances of not getting a disease, with no relative risk included, a whole lot of people who buy organic food, supplements, and alternatives to medicine will spend more. In the case of California, the government will force poor people to spend more, with no health or environmental benefit. The presence of a pathogen does not mean pathology, except on an IARC Working Group, where they exist to scare everyone about everything. In the real world of science and natural laws, the dose makes the poison.
The volatile organic compounds they're detecting in higher quantity still can't harm you unless you are styling your hair 140 times per day for 300 years. And are a rat. The authors are so intent on scaring the public that they say you can't even open a window, because then you are releasing it into the atmosphere, where it does... what? Decamethylcyclopentasiloxane (D5), which they cite specifically, doesn't do much at all and is gone in 5 or 6 days, unless the whole planet spends all their days - wait, nights, because nature makes it nearly 300 percent more prevalent at night - heating up their hair. This will be fodder for activists who are against cosmetics, but it isn't really informing the public about hazard or risk. It is just suggesting that a potential hazard in rats, in any amount, is risky. That's a way to troll for an expert witness contract from a trial lawyer, not science. Science 2.0, 4d ago

Top

Exascale computing’s promise rests on the ability to synthesize massive amounts of data into detailed simulations so complex that previous generations of computers couldn’t handle the calculations. The faster the computer, the more possibilities and probabilities can be plugged into the simulation to be tested against what’s already known — how a nuclear reactor might respond to a power failure, how cancer cells might respond to new treatments, how a 3D-printed design might hold up under strain. The process helps researchers target their experiments and fine-tune designs while saving the time and expense of real-world testing. Scientists from around the world compete for time on Frontier through DOE’s Innovative and Novel Computational Impact on Theory and Experiment, or INCITE, program. “The bigger the idea, the bigger the simulation,” Atchley said. “A machine of Frontier’s power can let you tackle exponentially larger problems because it can do the math faster than any other machine.” Simulating the universe? Exascale allows for not just more planets and stars but whole galaxies. Simulating quantum mechanics? A machine like Frontier allows for more particles. Simulating climate or weather? Frontier allows global modeling at a size, scale and level of accuracy over a longer time frame than ever before. “It’s like the difference between ‘Donkey Kong’ and ‘Grand Theft Auto,’” Atchley said. “Because Frontier is so much faster, we can perform simulations in minutes, hours or days that would take years or even decades to complete on other machines — which means they wouldn’t be done.” Scientists initially questioned whether exascale computing could be done at all. The discussion jumped from theoretical to practical in 2008 after the Roadrunner supercomputer at Los Alamos National Laboratory achieved petascale speed with a run clocked at 1 petaflop, or 1 quadrillion calculations per second.
The OLCF’s Jaguar logged nearly double that speed a year later. Could high-speed computing make the leap to the next order of magnitude? Not likely, experts warned. “Just one major challenge threatened to be a showstopper,” said Al Geist, an ORNL corporate fellow and chief technology officer for the Frontier project. “We identified four, and all of them would have to be overcome — power consumption, reliability, data movement and parallelism.” ornl.gov, 20d ago
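The petaflop definition above makes the "days versus decades" claim easy to quantify: exascale is a thousand times petascale. A quick sketch, using only the peta/exa rates from the article (the workload size is a made-up example chosen to illustrate the gap):

```python
# Back-of-the-envelope scaling between machine classes.
# Rates from the article's definitions; the workload itself is hypothetical.
PETAFLOPS = 1e15  # petascale peak, calculations per second (Roadrunner-class)
EXAFLOPS = 1e18   # exascale peak, calculations per second (Frontier-class)

workload_flop = 2.6e24  # hypothetical simulation cost in floating-point operations

days_on_exascale = workload_flop / EXAFLOPS / 86_400
years_on_petascale = workload_flop / PETAFLOPS / 86_400 / 365

print(f"~{days_on_exascale:.0f} days at exascale vs ~{years_on_petascale:.0f} years at petascale")
```

A job that occupies an exascale machine for about a month would tie up a petascale machine for the better part of a century, which is why such simulations simply would not be attempted on older systems.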
...to electromagnetic interference (EMI), generates no heat throughout the cable, and signals cannot be poached off fiber optics as they easily can be in copper-based systems. Fiber optics is also future-proof: the physics of light do not change, so the fiber optics being used today will essentially be the same a hundred years from now. Additionally, fiber optics has a fraction of the weight of copper. This is crucially important for aerospace applications, where engineers work for days simply to save a pound of mass. All these favorable attributes are readily acknowledged; however, there remains a bias against using fiber optics in harsh environments. Most of our everyday lives are not lived in such settings; our homes and offices are protected places where less rugged equipment is used. Nevertheless, there are many applications where electronic equipment is exposed to more demanding conditions, including temperature extremes, physical shock, high vibration, intense electrical interference, atmospheric pressure fluctuations, water immersion, corrosive chemical exposure and many other harsh conditions that have the potential to disable the system these electronics are intended to serve. Such systems are most vulnerable at the interconnection devices. Fiber optics is the perfect solution for high-reliability, high-performance, quality signal integrity in extreme environments like agriculture, mil/aero, and space applications, where space and weight are limited. At first glance, fiber optics may seem unsuitable for such applications. After all, a single-mode optical fiber is just 9 micrometers (microns) of glass that is cladded (essentially insulated) up to 125 microns and then protected with a similarly sized coating, typically up to 250 microns. For comparison’s sake, a human hair is...connectorsupplier.com, 20d ago
..."That put more pressure on realizing this capability," says Pang.According to NOAA, a rip current is a "powerful, narrow channel of fast-moving water." It pulls unsuspecting swimmers into deep water, where they risk fatigue and drowning after trying to fight the current. Santa Cruz Marine Safety reports 10 drowning deaths in the past decade, including two in 2023. The National Weather Service ranks rip currents as the third most dangerous of all weather hazards, just behind heat and flooding. Rip currents can be difficult to detect from shore and sometimes appear unexpectedly. Pang's team explored many different methods of rip detection and ultimately decided to use a machine-learning-based system similar to the obstacle detection systems used in self-driving cars. Machine learning is a type of artificial intelligence that describes the ability of a machine to make decisions based on information it has been given. Scientists showed their rip current detector a collection of images, some with rip currents and some without, to train the system to recognize the common attributes of a rip current. After training, the detector can find rip currents in live video streams. GovTech, 6d ago
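The training workflow described (show the system labeled frames with and without rip currents, learn their common attributes, then scan new frames) can be sketched with a deliberately tiny stand-in model. Everything here is an invented toy: the real system is a deep-learning object detector on live video, while this sketch uses synthetic "frames" and a single mean-brightness feature, chosen because rip currents often appear as darker, foam-free channels between breaking waves.

```python
import math
import random

# Toy sketch of the train-then-detect workflow (synthetic data; illustrative only).
random.seed(0)

def make_frame(has_rip):
    # Synthetic 64-pixel "frame": rip frames get a lower mean brightness,
    # mimicking the darker channel a rip current cuts through surf foam.
    base = 0.35 if has_rip else 0.75
    return [min(1.0, max(0.0, random.gauss(base, 0.1))) for _ in range(64)]

def feature(frame):
    return sum(frame) / len(frame) - 0.5  # centered mean brightness

# Labeled training set: frames with (1) and without (0) rip currents.
data = [(make_frame(True), 1) for _ in range(200)] + \
       [(make_frame(False), 0) for _ in range(200)]
xy = [(feature(f), y) for f, y in data]  # precompute features

# Fit a one-feature logistic regression by plain gradient descent.
w, b = 0.0, 0.0
for _ in range(500):
    gw = gb = 0.0
    for x, y in xy:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (p - y) * x
        gb += (p - y)
    w -= 0.5 * gw / len(xy)
    b -= 0.5 * gb / len(xy)

def detect(frame):
    """True if the trained model flags a rip current in this frame."""
    return 1.0 / (1.0 + math.exp(-(w * feature(frame) + b))) > 0.5

print(detect(make_frame(True)), detect(make_frame(False)))
```

The same loop scales up conceptually: swap the brightness feature for a convolutional network and the synthetic frames for labeled surf-camera video, and you have the shape of the detector the article describes.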