Tag Archives: Computer

If you don't want a camera on your computer, you're not getting the upgrade to Windows 11

While you might expect Microsoft to stipulate a minimum amount of RAM or a minimum processor specification before you can upgrade to Windows 11… we’ve never heard of a requirement quite like this. A surprise unearthed in the documentation published by Microsoft confirms that all Windows 11 laptops will need a camera from 2023 onwards.

For those who are privacy-conscious, that’s not good news. As reported by Neowin, this requirement won’t affect anyone looking to use Windows 11 at launch, but will kick into effect from January 1, 2023.

The Windows 11 documentation says: “Starting from January 1 2023, all Device Types expect [sic] Desktop PC, are required to have Forward-facing camera which meets the following requirements. A rear-facing camera is optional”.

The documentation also notes: “Switching a device out of Windows 11 Home in S mode also requires internet connectivity. For all Windows 11 editions, internet access is required to perform updates and to download and take advantage of some features. A Microsoft account is required for some features.”

Windows 11 is a huge upgrade, bringing with it a brand new Start menu, new widgets menu, Xbox Game Pass support, and much more. Windows 10 users will be able to upgrade for free, although you’ll need a PC with at least 4GB of RAM, a 64-bit processor running at 1GHz or faster with two or more cores, and at least 64GB of free storage.

This post originally appeared on Daily Express :: Tech Feed

Southwest Airlines grounds flights due to computer issue, reports say

HOUSTON, Texas (KTRK) — Southwest Airlines is responding to several concerns and complaints on social media regarding flights being grounded nationwide.

According to the flight-tracking website Flight Radar 24, more than a dozen flights that were set to take off Monday evening have not left and are delayed.

In responses to customers through social media, the airline wrote that it is experiencing a system error and is working to resolve it. The airline told ABC News the issue affects a system that provides weather information to flight crews prior to takeoff.

Eyewitness News reached out to Southwest Airlines for a statement but the company has not responded.

Author: KTRK

This post originally appeared on ABC13 RSS Feed

Crypto bounce sees Waves, Internet Computer gain 80–110% on rebound

The ripples created by the cryptocurrency market crash, which saw $1.1 trillion wiped from the global market cap in a matter of days, continued to reverberate on Wednesday as a majority of coins experienced notable rebounds.

Having lost 64% of its value after May 12, when the coin price fell from $40.50 to $14.43, the multi-purpose blockchain project Waves (WAVES) experienced a 95% bounce early on Thursday morning. The coin price climbed to $28.09 shortly prior to publication, in effect paring the coin’s weekly losses to just over 25% for the time being.

Another strong bounce came from recent market-cap top 10 entrant Internet Computer (ICP). The ICP coin price soared to over $600 just after it commenced trading on May 10. By May 19 the coin price had fallen to $100, a loss of 81%.

By Thursday morning Internet Computer had rebounded to the tune of 117%, climbing to a coin price of $217. The coin’s daily trade volume rose to its highest value to date, with over $1.6 billion worth of ICP changing hands on the day.

Bounces like these are not unexpected during tumultuous times in the cryptocurrency market, and many day-traders rejoice in the opportunities afforded them by such attractive, yet dangerous, volatility.

Bitcoin’s (BTC) bounce was less pronounced; the BTC coin price still gained close to 30% from its then value of $31,000, climbing back to over $40,000.

The coin price of recent gainer Dogecoin (DOGE) sank 67% over the course of the previous seven days, dropping to the $0.23 range after peaking at $0.73 just days earlier. Dogecoin’s 78% rebound from $0.23 to $0.420 was notable on Thursday, as it saw the coin price return to a humorous peak previously set by traders on April 20, or 4/20 day.

Author: Cointelegraph By Greg Thomson
This post originally appeared on Cointelegraph.com News

How shape-shifting magnets could help build a lower-emission computer

The device you’re using to read this article almost certainly operates by placing its zeroes and ones in bits of semiconductor, namely silicon—which constantly needs electricity to function.

In a world that’s pushing for net-zero carbon emissions, that sort of energy use won’t do. Luckily, researchers are working on fundamentally changing how computers work—which could lead to powerful, lower-energy devices. One way of doing that is to build a computer with magnets.

Researchers at the University of Michigan, collaborating with chip-maker Intel, have created a new iron alloy that could be a major feature of magnet-based computers of tomorrow. Their work was published recently in Nature Communications.

Their alloy acts as a magnetostrictor. That means it exploits the fact that when you place a magnetic material, such as iron, in a magnetic field, the material subtly shape-shifts. By adding other metals (an alloy is a mixture of metallic elements) and fine-tuning their proportions, you can make alloys that are more magnetostrictive, meaning they change shape more readily when the magnetic field around them changes.
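As background (this is the standard definition of the effect, not a figure from the Michigan paper), the strength of magnetostriction is measured as a fractional change in length:

\[ \lambda = \frac{\Delta L}{L} \]

Here \( \Delta L \) is the field-induced change in the length \( L \) of a sample. Even strong magnetostrictors shift by only parts per million to parts per thousand of their length, which is why sensitive detection matters.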

Today, magnetostrictors help us build high-quality sensors, since we can detect changes in a good magnetostrictor’s shape in the presence of magnetic fields, even rather weak ones. By using electrical current to create magnetic fields, you can force a magnetostrictor to shape-shift. In this way, you can convert the electrical energy of the current, relatively easily, into the mechanical energy of the magnetostrictor changing shape.

[Related: Hot computers are slow and dangerous—here’s how to cool yours down]

That’s a powerful ability. In the future, magnetostrictors might enable us to use tiny, changeable magnetic fields to form the zeroes and ones that make up the invisible bedrock of all our computing devices. 

In recent years, however, magnetostrictors have fallen by the wayside in materials science. “People have kind of shoved the magnetostrictor under the rug,” says John Heron, a materials scientist at the University of Michigan and one of the authors of the paper.

But there’s reason to pay attention to them. Today’s best magnetostrictors rely on rare-earth metals such as terbium and dysprosium. Rare earths tend to be (predictably) rare and expensive. Mining and extracting them is a difficult process that often generates toxic waste. And, with the bulk of production controlled by China, the global rare-earth trade is vulnerable to fickle geopolitics and US-China trade spats.

That’s partly why Heron and his colleagues sought to make a better magnetostrictor by mixing iron with a far cheaper and more accessible element: gallium, a soft, silvery metal that occurs in nature only in trace amounts within aluminum and zinc ores. Pure gallium has such a low melting point that it would turn to liquid in your hands.

The University of Michigan researchers are hardly the first to use gallium to make magnetostrictive materials, but their predecessors had run into a pesky limit.

“When you go above 20 percent gallium, the material is no longer stable,” says Heron. “The material changes symmetry, it changes crystal structure, and its properties change dramatically.” For one, the material becomes much less shape-shiftingly magnetostrictive.

To get around that limit, Heron and his colleagues had to stop the atoms from shifting their structure. So they crafted their alloy at a relatively chilly 320 degrees Fahrenheit (160 degrees Celsius), limiting its atoms’ energy. This locked the atoms in place and prevented them from moving about, even as the researchers infused more gallium into the alloy.

Through this method, the researchers were able to make an iron alloy with as much as 30 percent gallium, creating a new material that’s twice as magnetostrictive as its rare-earth counterparts.

This new, more effective magnetostrictor could help scientists build not only a cheaper computer, but also one that doesn’t rely on rare-earth minerals whose mining generates excessive carbon. 

In the grand scheme of things, your traditional home computer doesn’t use an excessive amount of energy. The mega-computer data centers that power the internet, though, are another story. While the exact amount of their electricity use and carbon emissions is contentious, there’s no denying the centers consume a lot of energy.

[Related: This is why Microsoft is putting data servers in the ocean]

To reduce those energy demands, researchers like Heron want to build devices that totally change how computers work. Magnetostrictors could be one way of doing that. Instead of using semiconductors that require constant electricity, tomorrow’s computers might use magnetostrictors to store bits in magnetic fields. For basic operations, such devices would only need electricity to change a zero to a one, or vice versa, instead of needing power continuously.

In addition to saving energy, such a computer would have several advantages over its existing counterparts. If it turned off unexpectedly, you wouldn’t lose what you were doing, because the magnetic bits would remain in place. Engineers also think these hypothetical computers would be easier to scale up, allowing performance levels that today’s semiconductors likely can’t match.

The technology is still in its infancy, though, so it’s not clear when, or even if, we might see magnetostrictor-based devices in our homes. “How many years away do I envision it becoming an iPhone technology?” says Heron. “Well, if I’m lucky, 20 or 30. Maybe never.”

“But demonstrating the fundamental bit … is something that we’re doing now,” he says.

Author: Monroe Hammond
This post originally appeared on Science – Popular Science

Now for AI’s Latest Trick: Writing Computer Code

Author: Will Knight
This post originally appeared on Business Latest

It can take years to learn how to write computer code well. SourceAI, a Paris startup, thinks programming shouldn’t be such a big deal.

The company is fine-tuning a tool that uses artificial intelligence to write code based on a short text description of what the code should do. Tell the company’s tool to “multiply two numbers given by a user,” for example, and it will whip up a dozen or so lines in Python to do just that.
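For a sense of what that output looks like, here is a hypothetical sketch of the roughly dozen lines of Python such a tool might produce for that prompt; it is illustrative, not SourceAI’s actual output:

```python
# Hypothetical sketch of code a text-to-code tool might generate for the
# prompt "multiply two numbers given by a user".
def multiply_user_numbers():
    a = float(input("Enter the first number: "))
    b = float(input("Enter the second number: "))
    product = a * b
    print(f"{a} x {b} = {product}")
    return product

if __name__ == "__main__":
    multiply_user_numbers()
```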

SourceAI’s ambitions are a sign of a broader revolution in software development. Advances in machine learning have made it possible to automate a growing array of coding tasks, from auto-completing segments of code and fine-tuning algorithms to searching source code and locating pesky bugs.

Automating coding could change software development, but the limitations and blind spots of modern AI may introduce new problems. Machine-learning algorithms can behave unpredictably, and code generated by a machine might harbor harmful bugs unless it is scrutinized carefully.

SourceAI and other similar programs aim to take advantage of GPT-3, a powerful AI language program announced in May 2020 by OpenAI, a San Francisco company focused on making fundamental advances in AI. The founders of SourceAI were among the first few hundred people to get access to GPT-3. OpenAI has not released the code for GPT-3, but it lets some users access the model through an API.
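In practice, API access at the time looked roughly like the sketch below, written against OpenAI’s original Python client; the engine name, prompt format, and parameters are illustrative assumptions, not details reported in this article:

```python
# A minimal sketch of generating code through the GPT-3 API as it existed
# around 2020-2021. The engine name and parameters are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # access was granted to approved users only

response = openai.Completion.create(
    engine="davinci",
    prompt="# Python 3\n# Multiply two numbers given by a user\n",
    max_tokens=100,
    temperature=0,  # low temperature suits deterministic code generation
)
print(response.choices[0].text)
```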

GPT-3 is an enormous artificial neural network trained on huge gobs of text scraped from the web. It does not grasp the meaning of that text, but it can capture patterns in language well enough to generate articles on a given subject, summarize a piece of writing succinctly, or answer questions about the contents of documents.

“While testing the tool, we realized that it could generate code,” says Furkan Bektes, SourceAI’s founder and CEO. “That’s when we had the idea to develop SourceAI.”

He wasn’t the first to notice the potential. Shortly after GPT-3 was released, one programmer showed that it could create custom web apps, including buttons, text input fields, and colors, by remixing snippets of code it had been fed. Another company, Debuild, plans to commercialize the technology.

SourceAI aims to let its users generate a wider range of programs in many different languages, thereby helping automate the creation of more software. “Developers will save time in coding, while people with no coding knowledge will also be able to develop applications,” Bektes says.

Another company, TabNine, used a previous version of OpenAI’s language model, GPT-2, which OpenAI has released, to build a tool that offers to auto-complete a line or a function when a developer starts typing.

Some software giants seem interested too. Microsoft invested $1 billion in OpenAI in 2019 and has agreed to license GPT-3. At the software giant’s Build conference in May, Sam Altman, a cofounder of OpenAI, demonstrated how GPT-3 could auto-complete code for a developer. Microsoft declined to comment on how it might use AI in its software development tools.

Brendan Dolan-Gavitt, an assistant professor in the Computer Science and Engineering Department at NYU, says language models such as GPT-3 will most likely be used to help human programmers. Other products will use the models to “identify likely bugs in your code as you write it, by looking for things that are ‘surprising’ to the language model,” he says.
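A minimal sketch of that “surprisal” idea follows, assuming a general-purpose language model (GPT-2 via Hugging Face) standing in for the code-trained models such a product would actually use; the model choice and threshold are assumptions, not details from Dolan-Gavitt:

```python
# Sketch: flag code tokens a language model finds "surprising".
# Model (GPT-2) and threshold are illustrative assumptions.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def flag_surprising_tokens(code: str, threshold: float = 8.0):
    """Return (token, surprisal) pairs whose surprisal exceeds the threshold."""
    ids = tokenizer(code, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # Surprisal of token t is -log p(token_t | tokens_<t).
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    surprisal = -log_probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)
    return [
        (tokenizer.decode([int(tok)]), float(s))
        for tok, s in zip(ids[0, 1:], surprisal)
        if float(s) > threshold
    ]

# A likely off-by-one bug may register as "surprising" to the model.
print(flag_surprising_tokens("for i in range(len(xs) + 1):\n    total += xs[i]"))
```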

Using AI to generate and analyze code can be problematic, however. In a paper posted online in March, researchers at MIT showed that an AI program trained to verify that code will run safely can be deceived: a few careful changes, like substituting certain variables, can slip a harmful program past it. Shashank Srikant, a PhD student involved with the work, says AI models should not be relied on too heavily. “Once these models go into production, things can get nasty pretty quickly,” he says.
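The paper’s attack is more sophisticated than this, but its core ingredient is a semantics-preserving edit like the variable substitution sketched below; the program’s behavior is unchanged, yet a brittle learned model may judge the two versions differently:

```python
# Two behaviorally identical functions; only the variable names differ.
# Renaming never changes what the code does, but it can change what a
# learned code-analysis model predicts about it.
def average(values):
    total = sum(values)
    return total / len(values)

def average_obfuscated(v0):
    t0 = sum(v0)
    return t0 / len(v0)

# Identical behavior on any input:
assert average([1.0, 2.0, 3.0]) == average_obfuscated([1.0, 2.0, 3.0])
```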

Dolan-Gavitt, the NYU professor, says the nature of the language models being used to generate coding tools also poses problems. “I think using language models directly would probably end up producing buggy and even insecure code,” he says. “After all, they’re trained on human-written code, which is very often buggy and insecure.”