intersection brushstroke question #47

Open
Miata1z opened this issue Nov 21, 2021 · 7 comments

@Miata1z

Miata1z commented Nov 21, 2021

Hello! I am trying to physically recreate the "sketch" produced by the model (markerpen). I chose the markerpen renderer because its output is convenient to decompose into vectors/points (Pic 1). I plan to paint with acrylic/oil, but that kind of paint cannot physically reproduce the effect shown in Pic 2, which I call the "intersection brushstroke" effect. I cannot simply ignore it, because it affects the final result of the painting (the picture is largely formed by these intersecting strokes).

Can I teach the model to rule this effect out, perhaps by switching to a different kind of stroke? Or maybe I have missed something and there is a ready-made solution? Thanks!
[image]

[image]
Pic 1

[image]
Pic 2

@jiupinjia
Owner

Hi @Miata1z, the physical implementation seems to be an interesting idea!

To remove the semi-transparent effect, you can modify line 293 and line 314 of networks.py to the following:
if self.rdrr.renderer in ['oilpaintbrush', 'airbrush', 'markerpen']:
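For intuition, here is a minimal, standalone sketch of standard alpha ("over") compositing — illustrative only, not the actual rendering code in networks.py — showing why a semi-transparent stroke produces visible intersections while a fully opaque stroke simply covers whatever lies beneath it:

```python
import torch

def composite(canvas, stroke, alpha):
    # Standard "over" compositing: with alpha < 1 the background shows through,
    # which is what creates the semi-transparent intersection effect.
    return stroke * alpha + canvas * (1 - alpha)

canvas = torch.ones(3, 4, 4)    # white canvas (illustrative size only)
stroke = torch.zeros(3, 4, 4)   # a black stroke covering the same patch

print(composite(canvas, stroke, 0.5)[0, 0, 0])  # 0.5 -> strokes blend where they overlap
print(composite(canvas, stroke, 1.0)[0, 0, 0])  # 0.0 -> the later stroke fully covers earlier ones
```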

Please tell me whether it works. Thanks!

@Miata1z
Author

Miata1z commented Nov 24, 2021

Hello! The modification you suggested did not help (I am attaching videos from before and after modifying networks.py). At first I was delighted when I saw how the result was rendered in the G_pred window, but in the end the model produced something completely different :) (Pic 1). Hence the question: does the G_pred preview window not take into account the edits I made to networks.py?
Yes, the intersection brushstroke effect still persists (Pic 2).

Meanwhile I settled on the oilpaintbrush model, but ran into a gradient problem (Pic 3); it would be difficult to reproduce physically :)
(I changed the brush sprite for the oilpaintbrush to make it easier to compute the brush strokes later :) ) (Pic 4)

An off-topic question: could you tell me how to save the generated strokes separately? I mean... (Pic 5)

Thanks!

[image: face_rendered_stroke_0500]
Pic 1

[image]
Pic 2

[image]
Pic 3

[image]
Pic 4

[image]
Pic 5
Generation was performed with the parameter --keep_aspect_ratio True (default False)

rendering video (original networks.py)
https://youtu.be/6uU8g-wyd2E

face animated (original networks.py)
https://youtu.be/ECYNNnTnUzU

rendering video (modified networks.py)
https://youtu.be/l3WN1DdJM78

face animated (modified networks.py)
https://www.youtube.com/watch?v=wdo0NLU67o4

output files (png strokes, etc)
modified : https://drive.google.com/file/d/1j6GoNqs_zHuGt5t0-CuBJW6fYDfhhKm_/view?usp=sharing
original: https://drive.google.com/file/d/1HkUwQuFO0b1LX7ORpUyFFpO7tLFjzhCf/view?usp=sharing

@jiupinjia
Owner

@Miata1z, sorry, that was my bad. You can ignore my last suggestion and instead update line 64 and line 94 of demo_prog.py to:
pt.x_alpha.data = torch.clamp(pt.x_alpha.data, 0.99, 1)

This will constrain the alpha value of the strokes to be close to 1.0 in both the optimization and final rendering phases.
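As a quick standalone illustration of what that clamp does (not the demo_prog.py code itself, just torch.clamp applied to an example tensor): any predicted alpha below 0.99 is raised to 0.99, so every stroke ends up rendered nearly opaque:

```python
import torch

x_alpha = torch.tensor([0.20, 0.75, 0.995])  # example predicted stroke alphas
x_alpha = torch.clamp(x_alpha, 0.99, 1)      # force all alphas into [0.99, 1]
print(x_alpha)                               # tensor([0.9900, 0.9900, 0.9950])
```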

Please let me know whether it helps this time.

@jiupinjia
Owner

@Miata1z By the way, I think your idea of physical painting with a robot arm is amazing. I really want to see if you can achieve that. Would you like to share the physical painting results with me when they are ready? Whether it succeeds or not, it will be interesting.

@Miata1z
Author

Miata1z commented Nov 26, 2021

Of course I will. In turn, I want to say that none of this would have happened without your repository :)

@Miata1z
Author

Miata1z commented Nov 27, 2021

pt.x_alpha.data = torch.clamp(pt.x_alpha.data, 0.99, 1)
Thanks. That helped. Result (face animated/strokes) https://drive.google.com/file/d/1vyaX9WcVYSJw0C3IqoT806EakZ-vIUDF/view?usp=sharing

If you don't mind, I have a couple more questions :)
Could you tell me how to save the generated strokes separately?
[image]

I don't think this will affect the result, but is it possible to somehow remove these artifacts?
[image]

As you already know, my next step is to transfer the stroke coordinates to a plotter. Perhaps I'm overcomplicating things and your algorithm already stores coordinates that I can convert and send to the plotter?

Thanks!

@jiupinjia
Owner

@Miata1z, good questions, and sorry for the late reply.

For the step-by-step strokes, you can find them in the ./output folder. If you load the stroke vector file ./output/*_strokes.npz correctly, you will see the step-by-step parameters. You can refer to lines 377-386 of ./painter.py for more details.
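If it helps, here is a minimal standalone sketch for inspecting such a file. It simply lists whatever arrays the .npz contains and their shapes; the exact key names are defined by painter.py, so this makes no assumptions about them:

```python
import glob
import numpy as np

# Minimal sketch (not the repo's loader): print every array stored in each
# saved stroke-vector file along with its shape, so the per-stroke parameter
# groups written by painter.py are visible.
for path in sorted(glob.glob('./output/*_strokes.npz')):
    data = np.load(path)
    print(path)
    for key in data.files:
        print('  ', key, data[key].shape)
```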

For the configuration of the stroke coordinates, you can refer to the supplementary material of our paper.
