Meta has released version 2 of its open-source Llama AI model, and it has caught many people’s attention – but not entirely for the right reasons. Coming in a broad spectrum of sizes, from 7 billion to an impressive 70 billion parameters, Llama 2 certainly stands out.
If you’re curious, you can experience the models for yourself on Perplexity, though only the 7 and 13 billion parameter versions are available there.
But as I’ve dug deeper into Llama 2, I’ve begun to ask myself: has Meta gone too far with safety measures?
Since its launch, OpenAI’s GPT-4 has been the talk of the town, marking yet another milestone in artificial intelligence. However, over the past few months, there’s been a rising suspicion within the AI community that GPT-4 has been “nerfed” or subtly downgraded. Despite these concerns, OpenAI maintains its stance that nothing has changed that would cause any significant impact on GPT-4’s performance or quality. But is that really the case?
In the cutthroat world of web development, trends come and go faster than a blink of an eye. Yet amidst this constant churn, there has been one relentless narrative: the supposed downfall of PHP and its offspring, WordPress. But here’s the twist—despite the years of criticism, proclamations of their death, and the rise of shinier, ‘cooler’ tools, PHP and WordPress are still standing. Not just standing but thriving.
Let’s face it. PHP has been the favourite whipping boy of developers for years. It’s been derided as messy, outdated, and everything in between. Yet, if PHP is as terrible as its critics claim, how has it survived and flourished in the competitive landscape of web development? The answer lies in its simplicity, flexibility, and resilience.
Have you ever found yourself startled by the uncanny resemblance between the smartphone in your hand and that of your mate’s, despite them being from entirely different manufacturers? You are not alone. This unsettling sameness is a symptom of a broader ailment plaguing the tech industry: homogenisation.
Like a relentless tide, homogenisation has washed over the technology landscape, reducing the once vibrant panorama of innovation to a monotonous, grey sea. This is a trend where uniqueness is relinquished in favour of uniformity, where diversity is suppressed for the sake of standardisation. But at what cost?
The knowledge cut-off for ChatGPT (including GPT-3.5 and GPT-4) is September 2021, which means GPT is not aware of Midjourney. However, due to how large language models (LLMs) like GPT work, they can be guided with example prompts (known as in-context learning) to produce the desired output.
This means you can teach GPT what you want it to do.
You are PromptGPT. You create detailed prompts for Midjourney, an AI image generator that produces images from detailed text prompts. First, you will be provided some example prompts. Then you will be provided some keywords which you will use to generate 5 prompts. Before you are provided examples, here is how Midjourney works:

- To set the aspect ratio of the image, you can use `--ar` to provide an aspect ratio.
- Specific camera models, ISO values, f-stops and lenses can be used to vary the image produced.
- `--chaos` changes how varied the results will be. Higher values produce more unusual and unexpected generations.
- `--weird` explores unusual aesthetics with the experimental weird parameter.

Prompt examples:

/imagine prompt: elderly man, by the sea, portrait photography, sunlight, smooth light, real photography fujifilm superia, full HD, taken on a Canon EOS R5 F1.2 ISO100 35MM --ar 4:3 --s 750

/imagine prompt: film photography portrait of young scottish prince looking at the camera, plate armor, hyperrealistic, late afternoon, overcast lighting, shot on kodak portra 200, film grain, nostalgic mood --ar 4:5 --q 2

/imagine prompt: photograph from 2018s China: a young couple in their 20s, dressed in white, stands in their home, displaying a range of emotions including laughter and tears. Behind them is a backdrop of a cluttered living space filled with white plastic trash bags and torn white paper rolls. Captured with a film camera, Fujifilm, and Kodak rolls, the image conveys a strong cinematic and grainy texture. This artwork uniquely documents the complex emotions and living conditions faced by the young people of that era. --ar 4:3

/imagine prompt: Young, handsome Keanu Reeves in a black long leather coat walking down the street in the rain --ar 2:3 --uplight

/imagine prompt: flat vector logo of deer head, golden on white

/imagine prompt: logo for a jazzy cat cafe with the text: "CATZ"

/imagine prompt: rainbows raining down from the sky, cyberpunk aesthetic, futuristic --chaos 50

/imagine prompt: illustration of a dog walker walking many dogs, tech, minimal vector flat --no photo detail realistic

Only use the above as examples. Use the following keywords to create new prompts: Dog, t-shirt design, afghan hound

What this prompt does is effectively condition GPT, through in-context examples, to produce a desired output. You show it what you want it to do, provide some additional information and then, in this case, provide some keywords to produce an outcome.
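The same technique works outside the ChatGPT UI via the API. As a minimal sketch, assuming the standard OpenAI chat message format: the PromptGPT instructions go in the system message and your keywords in the user message (the truncated instructions string below is a placeholder for the full prompt above).

```python
# Placeholder for the full PromptGPT instructions shown above.
PROMPTGPT_INSTRUCTIONS = "You are PromptGPT. You create detailed prompts for Midjourney..."


def build_promptgpt_messages(keywords: list[str]) -> list[dict]:
    """Assemble a chat payload: instructions as the system message,
    keywords as the user message."""
    return [
        {"role": "system", "content": PROMPTGPT_INSTRUCTIONS},
        {"role": "user", "content": "Keywords: " + ", ".join(keywords)},
    ]


messages = build_promptgpt_messages(["Dog", "t-shirt design", "afghan hound"])
```

The resulting `messages` list can be sent to any chat-completion endpoint; the examples in the system prompt do the "teaching", with no fine-tuning involved.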
Since OpenAI released its long-awaited Code Interpreter plugin for ChatGPT, I have been playing with it extensively, throwing everything at it: uploading a zip file of a large repository and asking it questions, analysing spreadsheets, and generating imagery.
It appears that most people are using Code Interpreter for what it was intended for: working with data and code, performing analysis on documents, and other impressive tasks.
As developers, we are always looking for ways to make our lives easier, and that often means bringing in third-party libraries and tools that abstract away the nitty-gritty details of specific tasks. Langchain is one such tool that aims to simplify working with AI APIs (in reality, it doesn’t). However, as we’ll discuss in this blog post, you might not need Langchain at all. In fact, using direct APIs, such as the OpenAI API, can often result in better performance and less complexity.
Prep time: 10 minutes
Cook time: 30 minutes
Cooling Time: 1 hour
Yield: Makes about 10-12 sticks
I love the Darrell Lea Batch 37 liquorice. It’s distinctively liquorice, but the texture and flavour seem to be different to any other I have ever tasted.
Looking at the ingredients, it seems to be a traditional liquorice recipe with a few little additions. In the commercial production of liquorice, more specialised ingredients may be used. The flavour is primarily from liquorice root extract, which is more potent and has a distinctive flavour. In our version, we’ve substituted this with liquorice powder, which is easier for the home cook to source.
As whispers of an impending recession grow louder, it’s natural to feel a sense of trepidation. Economic downturns can be challenging, but they also harbour a less-told narrative of resilience, innovation, and opportunity. History has shown us that some of the most groundbreaking companies were born amidst economic turmoil. These tales of triumph serve as powerful reminders that even in the face of adversity, there lies a golden opportunity to build something extraordinary.
There has been a bit of talk about Langchain lately, specifically that it is creating a walled garden around AI apps and results in lock-in. In this post, we’ll compare Langchain with just using an official SDK. I’ll assume you’re working with OpenAI, but Anthropic and Hugging Face (amongst others) are also worth considering.
To understand the differences, start with what Langchain is: a framework for building AI apps. If you are a developer wanting to throw something together quickly, it is brilliant for knocking out AI API wrapper apps, especially around the OpenAI GPT API.
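For contrast, here is a hypothetical sketch of that "quick wrapper" pattern in Langchain, assuming the classic langchain 0.0.x API (`ChatOpenAI`, `PromptTemplate`, `LLMChain`); names have shifted across versions, so treat this as illustrative rather than definitive.

```python
def make_prompt_text(product: str) -> str:
    """The template Langchain fills in, shown here as a plain function
    so you can see exactly what the abstraction is wrapping."""
    return f"What is a good name for a company that makes {product}?"


def run_chain(product: str) -> str:
    """Assemble and run a Langchain LLMChain for the same prompt."""
    # Imports kept local so this sketch is importable without langchain.
    from langchain.chains import LLMChain
    from langchain.chat_models import ChatOpenAI
    from langchain.prompts import PromptTemplate

    chain = LLMChain(
        llm=ChatOpenAI(model_name="gpt-3.5-turbo"),
        prompt=PromptTemplate.from_template(
            "What is a good name for a company that makes {product}?"
        ),
    )
    return chain.run(product=product)
```

Three imports and two objects to do what a single f-string and one API call do directly; whether that trade is worth it is exactly the debate.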