24 and AI

GEEK FREE
By Joe Callison
22 January, 2024

It appears that 2024 will be the year that artificial intelligence (AI) goes mainstream. Microsoft is readying the planned 24H2 update of Windows 11 to debut AI throughout the operating system on computers equipped with a neural processing unit (NPU) in addition to the central processing unit (CPU) and graphics processing unit (GPU). These new systems are beginning to appear now, and by mid-year they will probably ship with AI-enabled Windows 11 pre-installed. Upgrades to add at least some AI features to existing Windows 11 systems should arrive in the fall with the release of 24H2. Microsoft has hinted that the requirements for a full AI implementation may include a minimum of 16GB of RAM and neural processing capability of at least 40 trillion operations per second (TOPS). That level of performance will be available in the new Intel Core Ultra and AMD 8000 Series processors showing up in computers later this year, which is expected to boost sales of new machines.

AI-enabled computing can greatly speed up the search, analysis, and generation of huge amounts of data. Concerns about AI-generated results include their quality, validity, and potential bias, all of which depend heavily on how the AI model is built and trained by humans. Like many advances in technology, it can be used for good or it can (and will) be used for evil.

I have done some experimenting with ChatGPT and Bing Chat and have found them useful for research-type internet searches, and I appreciate that they include references for the information presented. I framed some questions to test for bias or assumptions and did not find evidence of either; both sides of controversial issues were typically presented in the results. ChatGPT even responded that it is not programmed to make assumptions but only to use facts. It is still up to the user to evaluate the validity of the material presented. Just because the majority of articles in the databases searched state or reference something does not make it valid. Authors tend to cite data or views they agree with, and the number of references citing the same data or views can reflect popular opinion more than facts. This could be a problem if people depend too heavily on AI for things such as medical self-diagnosis.
