16.02.2025 21:34

Gift box with AI

As I've mentioned before, I have a resin printer, and I occasionally use it to bring ideas to life. One of those ideas was a gift box for earrings.

The three pairs of earrings were shaped like wings, so the idea was to make the box in a matching style. Through experimentation, the following pipeline emerged: prompt generation (ChatGPT) > image generation from those prompts (local Flux dev) > 3D model generation (tripo3d).

Prompt Generation

If you don't know exactly what you want, or you want some variety, ChatGPT can easily generate the needed ideas in text form. DALL-E is currently not competitive enough to use directly for image generation, so we ask ChatGPT only for text.

I wanted a heart-shaped box where the halves would be wings.

Request:

I want to make a gift box for earrings (3 pairs), the box should be in the shape of a heart, made up of two wings, top view, so it would be easier for me to convert it to 3D for printing.
I need a prompt for flux that will generate a top view of this box.

Response:

Top-down view of a heart-shaped box, where the heart is formed by two large, symmetrical wings. The wings curve inward, meeting at the center to form the pointed bottom of the heart. The wings have subtle feather-like textures, with lines running along the surface to suggest individual feathers. The surface of the wings is smooth and gently curved. The heart's edges are clean and sharp, and the center line where the wings meet is distinct but seamless. Inside the heart, visible from the top, there are three evenly spaced compartments, each designed to hold a pair of earrings. The overall design should be sleek and minimal, with clear symmetry between the wings. Soft lighting highlights the contours and textures of the wings, and the color is a soft, neutral tone such as white or light grey.

In practice, such a long prompt isn't really necessary, since Flux understands plain natural language thanks to its built-in text encoder.

Image Generation

So, why Flux? Those familiar with AI-generated content probably don't need an explanation of why DALL-E isn't competitive: it has its own distinctive, non-realistic, semi-cartoonish style. For example, here is the same prompt rendered in DALL-E:

dall-e version

Overall okay, but the view isn't from above. The generation time is tolerable, but there are usage limits. With a locally hosted Flux that can be left running for hours churning out variants, I don't see much point in bothering ChatGPT.

We load the prompt into Flux and generate 300 images. Why 300? Because I can. One generation takes about a minute, so the whole run took around five hours.
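The article doesn't show the exact tooling, but such an overnight batch run can be sketched as a small driver script. The driver itself is generic and takes any `generate(prompt, seed)` callable; the commented-out diffusers `FluxPipeline` snippet is my assumption about how the local Flux dev setup might look:

```python
from pathlib import Path

def run_batch(generate, prompt, n_images, out_dir="flux_out"):
    """Generate n_images variants of one prompt, one seed per image,
    and save them as 0000.png, 0001.png, ... Returns the saved paths."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for seed in range(n_images):
        image = generate(prompt, seed)   # ~1 minute per image on the author's setup
        path = out / f"{seed:04d}.png"
        image.save(path)
        paths.append(path)
    return paths

# A possible generate() using diffusers (assumed setup, not run here):
# import torch
# from diffusers import FluxPipeline
# pipe = FluxPipeline.from_pretrained(
#     "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16).to("cuda")
# generate = lambda prompt, seed: pipe(
#     prompt, generator=torch.Generator("cuda").manual_seed(seed)).images[0]
```

Seeding each image separately keeps every variant reproducible, so a favorite can be regenerated later at a higher step count.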

flux dev results

And this is just over a third of all the variants. On another project, which I'll cover in separate articles, even that isn't the limit: there I generated 1500 images, though with much more variability.

I chose one that I liked the most:

flux best result

3D Model

AI can help here too, at least as of this writing. tripo3d (rodin) is probably the best solution on the market at the moment. Simply go to the website, upload the image, and within a few minutes you get a ready 3D model:

tripo3d gif

But the devil is in the details: at high magnification, artifacts become visible (largely a consequence of using a low-quality input image):

tripo3d artefacts

And that's how it looks now. I made the box six months ago, when tripo3d hadn't yet been updated to its current version; back then the results looked even worse (shown here with a different input image):

tripo3d old result

While this might be sufficient for an FDM printer, the model is too noisy for resin printing: all of these artifacts would show up on the print.

I also tried generating a normal map from the image to feed into Blender, but img2img normal-map generators are even worse than img2model ones. The best result I found looked like this:

normal map
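For reference, a normal map can also be derived from a grayscale image without any AI, by treating the image as a heightmap and taking its gradients. This is the classic approach, not what the img2img generators above do; a minimal numpy sketch (function name is mine):

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a 2D heightmap (values roughly in [0, 1]) into an RGB
    tangent-space normal map packed into the usual [0, 255] range."""
    height = height.astype(np.float64)
    gy, gx = np.gradient(height)             # surface slopes along y and x
    nz = np.full(height.shape, 1.0 / max(strength, 1e-8))
    n = np.stack([-gx, -gy, nz], axis=-1)    # unnormalized normal vectors
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return ((n * 0.5 + 0.5) * 255.0).astype(np.uint8)
```

A flat heightmap produces the familiar uniform bluish-purple normal map (normals pointing straight up); raising `strength` exaggerates the relief.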

Modeling

So at this stage I decided to do without AI tools for the modeling. I used the original Flux image directly as a normal map. The result wasn't exactly like the 3D renders, but the texture still came out interesting and recognizable:

blender top 3d model
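One caveat: for rendering, a normal map only fakes relief, while a print needs real geometry. The same image can instead drive actual displacement of the mesh, which is what Blender's Displace modifier does with an image texture. A toy sketch of that idea (not the author's exact Blender setup):

```python
import numpy as np

def displace_grid(height, scale=1.0):
    """Build an (H, W, 3) array of 3D vertices for a flat grid whose z
    coordinate comes from a heightmap, like a Displace modifier with an
    image texture: bright pixels are pushed up, dark pixels stay low."""
    h, w = height.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    zs = height.astype(np.float64) * scale
    return np.stack([xs, ys, zs], axis=-1)
```

In Blender the equivalent is subdividing the lid surface and adding a Displace modifier with the Flux image as its texture, then baking the modifier down before export.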

Then we smooth out the roughness, do some manual refinement by sculpting in Blender, and model the lid and the box with cutouts for the earrings in SolidWorks (I've since switched to Fusion 360), so that they don't rattle around inside or fall out.

Result

Then we print, remove the supports, and send the part to the "solarium" for washing and UV curing:

diy wash and cure

That's not all: we sand it, varnish it, and get the final version:

result
