Threads profile for Darrell Ulm

I've recently joined Threads as Darrell Ulm ( https://www.threads.com/@darrell_ulm ) as I set out to relearn and expand my knowledge in areas like artificial intelligence. My current focus is on the intricacies of AI, particularly Large Language Models (LLMs) and how these sophisticated models are developed and used. I'm also revisiting the fundamentals of neural networks, the core building blocks that let AI systems learn and make predictions. Given the computational demands of these fields, I'm also keen to extend the principles and applications of parallel processing I studied earlier, since it plays a crucial role in efficiently handling the complex computations involved in AI.

Darrell R. Ulm

Popular posts from this blog

Research Papers, Darrell Ulm on Microsoft Academic Site

Microsoft used to host its Academic Search platform, but that site has since been taken offline. The link originally pointed to Darrell Ulm's research on parallel algorithms, data parallelism, and high-performance computing. It's a bit surprising that Microsoft retired the publication portal, because it was a genuinely valuable hub for technical papers and academic references.

Paper by Darrell Ulm: Virtual Parallelism by Self Simulation of the Multiple Instruction Stream Associative Model

The CiteSeer entry: "Virtual Parallelism by Self Simulation of the Multiple Instruction Stream Associative Model" (1995), Darrell Ulm. The paper examines the relative power of the MASC model when simulating itself and the algorithmic overhead of that simulation.