Posts

How To Use ChatGPT to Create Exceptional Midjourney Prompts

The knowledge cut-off for ChatGPT (including GPT-3.5 and GPT-4) is September 2021, which means GPT is not aware of Midjourney. However, due to how large language models (LLMs) work, they can be steered with a handful of example prompts to produce the desired output. In other words, you can teach GPT what you want it to do. Here is the prompt:

You are PromptGPT. You create detailed prompts for Midjourney, which is an AI image generator that produces images from detailed text prompts. First, you are going to be provided some example prompts. Then you are going to be provided some keywords, which you will then use to generate 5 prompts. Before you are provided examples, here is how Midjourney works.

- To set the aspect ratio of the image, you can use `--ar` to provide an aspect ratio.
- Specific camera models, ISO values, f-stops and lenses can be used to vary the image produced.
- `--chaos` Change how varied the results will be. Higher values produce more unusual and unexpected generations.
- `--weird` Explore unusual aesthetics with the experimental weird parameter.

Prompt examples:

/imagine prompt: elderly man, by the sea, portrait photography, sunlight, smooth light, real photography fujifilm superia, full HD, taken on a Canon EOS R5 F1.2 ISO100 35MM --ar 4:3 --s 750

/imagine prompt: film photography portrait of young scottish prince looking at the camera, plate armor, hyperrealistic, late afternoon, overcast lighting, shot on kodak portra 200, film grain, nostalgic mood --ar 4:5 --q 2

/imagine prompt: photograph from 2018s China: a young couple in their 20s, dressed in white, stands in their home, displaying a range of emotions including laughter and tears. Behind them is a backdrop of a cluttered living space filled with white plastic trash bags and torn white paper rolls. Captured with a film camera, Fujifilm, and Kodak rolls, the image conveys a strong cinematic and grainy texture. This artwork uniquely documents the complex emotions and living conditions faced by the young people of that era. --ar 4:3

/imagine prompt: Young, handsome Keanu Reeves in a black long leather coat walking down the street in the rain --ar 2:3 --uplight

/imagine prompt: flat vector logo of deer head, golden on white

/imagine prompt: logo for a jazzy cat cafe with the text: "CATZ"

/imagine prompt: rainbows raining down from the sky, cyberpunk aesthetic, futuristic --chaos 50

/imagine prompt: illustration of a dog walker walking many dogs, tech, minimal vector flat --no photo detail realistic

Only use the above as examples. Use the following keywords to create new prompts: Dog, t-shirt design, afghan hound

What this prompt does is essentially fine-tune GPT in context to produce a desired output. You teach it what you want it to do, provide some additional information and then, in this case, provide some keywords to produce an outcome.
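If you would rather script this than paste it into the ChatGPT UI, here is a minimal sketch using the openai npm package (v4). The model name is an example and the prompt text is truncated, so substitute the full prompt from above:

```ts
// A sketch, not production code: send the PromptGPT instructions as a system
// message, then supply the keywords as the user message. Requires
// OPENAI_API_KEY in the environment.
import OpenAI from "openai";

const client = new OpenAI();

const promptGptInstructions = `You are PromptGPT. You create detailed prompts
for Midjourney... (paste the full prompt, parameter notes and examples here)`;

async function generatePrompts(keywords: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "gpt-4", // any chat model works
    messages: [
      { role: "system", content: promptGptInstructions },
      { role: "user", content: `Use the following keywords to create new prompts: ${keywords}` },
    ],
  });
  return completion.choices[0].message.content ?? "";
}

generatePrompts("Dog, t-shirt design, afghan hound").then(console.log);
```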

Is ChatGPT Code Interpreter GPT-4.5 In Disguise?

Since OpenAI released its long-awaited Code Interpreter plugin for ChatGPT, I have been playing with it extensively, throwing everything at it: uploading a zip file of a large repository and asking it questions, uploading spreadsheets, generating imagery. It appears that most people are using Code Interpreter for what it was intended for: working with data and code, performing analysis on documents, and other awesome things.

You Probably Don’t Need Langchain

As developers, we are always looking for ways to make our lives easier, and that often means bringing in third-party libraries and tools that abstract away the nitty-gritty details of specific tasks. Langchain is one such tool that aims to simplify working with AI APIs (in reality, it doesn’t). However, as we’ll discuss in this blog post, you might not need Langchain at all. In fact, using direct APIs, such as the OpenAI API, can often result in better performance and less complexity.
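To illustrate how little a direct call takes, here is a minimal sketch hitting OpenAI's Chat Completions REST endpoint with nothing but fetch; the model name is an example and the error handling is deliberately thin:

```ts
// A minimal sketch: one fetch call against OpenAI's Chat Completions
// endpoint, no wrapper library in sight.
async function ask(question: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!response.ok) throw new Error(`OpenAI API error: ${response.status}`);
  const data = await response.json();
  return data.choices[0].message.content;
}

ask("Summarise Langchain in one sentence.").then(console.log);
```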

Homemade Darrell Lea Batch 37 Inspired Liquorice Recipe

Prep time: 10 minutes
Cook time: 30 minutes
Cooling time: 1 hour
Yield: Makes about 10-12 sticks

I love the Darrell Lea Batch 37 liquorice. It's distinctively liquorice, but the texture and flavour seem to be different to any other I have ever tasted. Looking at the ingredients, it seems to be a traditional liquorice recipe with a few little additions. In the commercial production of liquorice, more specialised ingredients may be used. The flavour is primarily from liquorice root extract, which is more potent and has a distinctive flavour. In our version, we've substituted this with liquorice powder, which is easier for the home cook to source.

Recession: A Golden Opportunity in Disguise

As whispers of an impending recession grow louder, it’s natural to feel a sense of trepidation. Economic downturns can be challenging, but they also harbour a less-told narrative of resilience, innovation, and opportunity. History has shown us that some of the most groundbreaking companies were born amidst economic turmoil. These tales of triumph serve as powerful reminders that even in the face of adversity, there lies a golden opportunity to build something extraordinary.

Langchain vs OpenAI SDKs

There has been a bit of talk about Langchain lately regarding the walled garden it creates around AI apps and the lock-in that results. In this post, we'll compare Langchain with just using an official SDK. I assume you're working with OpenAI, but we also have Anthropic and Hugging Face (amongst others) to consider. To understand the differences, it helps to know that Langchain is a framework for building AI apps. If you are a developer wanting to throw something together quickly, it is brilliant for knocking out AI API wrapper apps, especially around the OpenAI GPT API.
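For a concrete feel for the difference, here is roughly the same single chat call written both ways. Treat it as a sketch: the Langchain import paths shown are the mid-2023 package layout, which has since moved to @langchain/openai with invoke() replacing call():

```ts
// The same chat call via Langchain (mid-2023 layout) and via the official
// OpenAI SDK (v4). Both snippets are illustrative, not drop-in code.
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";
import OpenAI from "openai";

// Via Langchain
const chat = new ChatOpenAI({ temperature: 0 });
const lcResponse = await chat.call([new HumanMessage("Hello!")]);

// Via the official OpenAI SDK
const openai = new OpenAI();
const sdkResponse = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
});
```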

How to Access the Controller and Viewmodel of an Aurelia 2 Component

In Aurelia 1, you could access the controller of an Aurelia component via the au.controller property of an element. In Aurelia 2, there is a better way: the CustomElement.for method, which, given an element, returns its controller. For example, `const dialogController = CustomElement.for(host.querySelector(".my-component") as HTMLElement);` followed by `const dialogVm = dialogController.viewModel;`. You can also access a property on the element directly if you prefer, using `element.$au['au:resource:custom-element']`, but CustomElement.for might save you the hassle of typing things yourself if you're working with TypeScript.
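As a self-contained sketch, assuming an element with the class "my-component" hosts an Aurelia 2 custom element:

```ts
// A minimal sketch: CustomElement.for returns the element's controller,
// which exposes the view model. The selector is a placeholder.
import { CustomElement } from "@aurelia/runtime-html";

const host = document.querySelector(".my-component") as HTMLElement;
const controller = CustomElement.for(host);
const viewModel = controller.viewModel; // the component instance
```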

CommBank Employees Are Threatening to Quit if the Forced Return to the Office Mandate Goes Through

When will companies learn that despite some people wanting to be in an office, many people who have been given a taste of remote work during the pandemic don’t want to return to the office? One of Australia’s largest banks, Commonwealth Bank, conjured a storm of epic proportions last month when it announced it wanted all 49,000 employees back in the office at least 50% of the time by July 17, 2023.

How to Programmatically Set Routes in Aurelia 2 Using AppTask

One of my favourite additions to Aurelia 2 is app tasks. These are framework-level entry points designed to allow you to run code at different points of the framework life cycle. Recently, while porting over an Aurelia 1 application to Aurelia 2, I encountered a unique use case where code was being run inside configureRouter to asynchronously fetch data from an API to provide metadata for routes. In Aurelia 2’s @aurelia/router package, you set routes using a decorator or a static routes property. How on Earth do you run code that touches the routes before the router gets them?
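Here is a sketch of the approach, not the post's exact code: an app task fetches metadata before the root component, and therefore its routes, is processed. The endpoint, the metadata shape and the choice of the hydrating slot are all assumptions:

```ts
// A sketch of the idea: run an async app task early enough that it can
// rewrite MyApp's static routes before the router reads them.
import Aurelia, { AppTask, IContainer } from "aurelia";
import { RouterConfiguration } from "@aurelia/router";
import { MyApp } from "./my-app";

Aurelia
  .register(
    RouterConfiguration,
    AppTask.hydrating(IContainer, async () => {
      const response = await fetch("/api/route-metadata"); // hypothetical API
      const metadata: { path: string; component: string; title: string }[] =
        await response.json();
      // Assign the routes before the router processes the root component
      MyApp.routes = metadata.map((route) => ({
        path: route.path,
        component: () => import(`./pages/${route.component}`),
        title: route.title,
      }));
    })
  )
  .app(MyApp)
  .start();
```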

A Regular Expression to Convert PLATFORM.moduleName in Aurelia to Aurelia 2 Compatible Route Syntax

When migrating an Aurelia 1 application to Aurelia 2 recently, I had to tediously convert many routes. As you might have discovered, the Aurelia 2 @aurelia/router is different to the Aurelia 1 router. Not wanting to manually change 50+ PLATFORM.moduleName values, I opted for a regular expression. I hate RegEx because I don't understand it, but you cannot deny its power. Here's the solution I used.
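The exact expression isn't included in this excerpt, but to illustrate the shape such a rewrite can take, here is a hypothetical sketch that assumes Aurelia 1 routes using moduleId: PLATFORM.moduleName('...') and targets Aurelia 2's path/component syntax:

```ts
// Hypothetical sketch only -- not the post's actual expression. It assumes
// Aurelia 1 routes shaped like:
//   { route: 'home', moduleId: PLATFORM.moduleName('pages/home') }
// and rewrites them into Aurelia 2's path/component style.
const source = `{ route: 'home', moduleId: PLATFORM.moduleName('pages/home') }`;

const converted = source
  .replace(/\broute:/g, "path:")
  .replace(
    /moduleId:\s*PLATFORM\.moduleName\(\s*'([^']+)'\s*\)/g,
    "component: () => import('./$1')"
  );

console.log(converted);
// -> { path: 'home', component: () => import('./pages/home') }
```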