Danbooru


Artist

  • nefula14 32

Copyright

  • blue archive 312k

Characters

  • akane (blue archive) 1.9k
  • ↳ akane (bunny) (blue archive) 1.1k

General

  • 1girl 6.7M
  • animal ears 1.3M
  • aqua bow 12k
  • aqua bowtie 6.7k
  • black-framed eyewear 43k
  • blue bow 135k
  • blue bowtie 38k
  • bow 1.3M
  • bowtie 356k
  • breasts 3.9M
  • brown halo 814
  • brown shawl 810
  • detached collar 147k
  • fake animal ears 172k
  • glasses 407k
  • halo 330k
  • leotard 237k
  • official alternate costume 403k
  • plaid clothes 151k
  • plaid shawl 496
  • playboy bunny 117k
  • rabbit ears 256k
  • shawl 30k
  • solo 5.6M
  • strapless 179k
  • strapless leotard 70k
  • traditional bowtie 11k
  • watermark 157k
  • white leotard 33k

Meta

  • adversarial noise 4.3k
  • highres 6.1M
  • ↳ absurdres 2.1M
  • revision 65k

Information

  • ID: 8462642
  • Uploader: Seventhcomet
  • Date: 8 months ago
  • Size: 1.86 MB .jpg (2920x4096)
  • Source: twitter.com/lita_illust/status/1859051981647868040
  • Rating: Sensitive
  • Score: 13
  • Favorites: 15
  • Status: Deleted


This post was deleted for the following reason:

Unapproved in three days (8 months ago)
This post belongs to a parent: post #8443608
akane (blue archive) drawn by nefula14

Artist's commentary

  • #ブルアカイラスト部 ("Blue Archive illustration club") Reposted

Comments

    Dikkus
    8 months ago

    Still can't get used to these rather ugly anti-AI measures all these artists are implementing.

    7 Reply
    T34-38
    8 months ago

    Dikkus said:

    Still can't get used to these rather ugly anti-AI measures all these artists are implementing.

    It's basically defacement at this point. Most people think it will stop "AI", but in the end only the person who controls the AI decides whether it does.

    5 Reply
    die301
    8 months ago

    Fucking dots!!! Idiot artist!

    -8 Reply
    anon7631
    8 months ago

    T34-38 said:

    It's basically defacement at this point. Most people think it will stop "AI", but in the end only the person who controls the AI decides whether it does.

    Given that the basic concept (heavily simplified) of AI image-gen is that you hand a denoising tool a canvas of nonsense and see what it hallucinates the "real" image under the noise to be, I doubt it would be a significant detour to also train denoising tools on common forms of adversarial noise, to clean up their training set. Not perfect, since in the end the image is being irrevocably defaced, but good enough. It'd just need the same tools put to a different use. I remember seeing extensions for AI frontends to handle common varieties like Glaze a year and a half ago. And that's assuming the adversarial noise works as well as its authors claim in the first place.

    The ugly watermarks won't do anything, because in large-scale AI efforts there's probably never going to be a human who even sees them among the training set; at best the image will get included anyway, and at worst images detected as watermarked will be passed through an automated removal tool, which will be imperfect but good enough. A hobbyist making a LoRA can use more precise semi-automated watermark-removal tools with manual touch-up, because the training sets are small enough for that to be viable.

    In the end, the most "effective" aspect of these measures is that if an artist reduces his work to nothing but ugly garbage that isn't worth looking at, then people won't pay the artist's work enough attention to even want to make LoRAs. But if someone does care enough to go through all the preprocessing, the irony is that what comes out of the LoRA will probably look a lot better than the real artist's work with all this sabotage on it.
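    The denoising idea described above can be illustrated with a toy example. This is a minimal sketch, not anyone's actual pipeline: real diffusion models learn the denoiser from data, whereas here a fixed moving-average filter stands in for it, and a 1-D ramp stands in for an image (assuming only NumPy).

    ```python
    import numpy as np

    # Toy illustration of the denoising principle: a clean signal,
    # corrupted by noise, is approximately recovered by a simple
    # smoothing filter. Diffusion models learn this step instead.
    rng = np.random.default_rng(0)

    clean = np.linspace(0.0, 1.0, 256)          # stand-in for a real image
    noisy = clean + rng.normal(0.0, 0.2, 256)   # adversarial-style perturbation

    # naive denoiser: moving average over a 9-sample window
    kernel = np.ones(9) / 9
    denoised = np.convolve(noisy, kernel, mode="same")

    # compare mean squared error away from the window's edge effects
    err_noisy = float(np.mean((noisy[8:-8] - clean[8:-8]) ** 2))
    err_denoised = float(np.mean((denoised[8:-8] - clean[8:-8]) ** 2))
    print(err_denoised < err_noisy)  # smoothing recovers most of the signal
    ```

    Adversarially perturbed images are the same setup with a cleverer noise pattern, which is why training the denoiser on examples of that specific noise is a plausible counter.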

    6 Reply
    Jnglmpera
    8 months ago

    T34-38 said:

    It's basically defacement at this point. Most people think it will stop "AI", but in the end only the person who controls the AI decides whether it does.

    Agreed. As much as I avoid posting those watermarked images myself, since the watermarks ruin them for me, it's unlikely that Elon or Jack or the other big tech companies would refrain from scraping them just because the artist put things on their art.

    4 Reply
    Saladofstones
    8 months ago

    Isn't this going to have limited effect since they still have the absolutely massive trove of scraped data from before anyone knew this was a thing?

    7 Reply
    T34-38
    8 months ago

    Saladofstones said:

    Isn't this going to have limited effect since they still have the absolutely massive trove of scraped data from before anyone knew this was a thing?

    I was told by someone that AI can still process data from it. Worse, since the idiots all use the same watermark image, a model can easily learn to remove it after seeing a dozen artworks bearing the same watermark... In short, their efforts are completely pointless.

    5 Reply
    cd_young
    8 months ago

    Jnglmpera said:

    Agreed. As much as I avoid posting those watermarked images myself, since the watermarks ruin them for me, it's unlikely that Elon or Jack or the other big tech companies would refrain from scraping them just because the artist put things on their art.

    People can opt out of Grok using their material for learning. I feel most antis didn't read that far into the agreement.

    0 Reply
    anon7631
    8 months ago

    cd_young said:

    People can opt out of Grok using their material for learning. I feel most antis didn't read that far into the agreement.

    People can opt out of Grok, but the only way to opt out of AI entirely is not to upload the art to the internet at all. Making something public inevitably means losing some control. Twitter has no opt-out or ToS condition against art being reuploaded from there to Danbooru, but that doesn't stop people. And neither did this image's "do not reupload" watermark.

    Edit: That said, I think you're right, even though people are downvoting. It does seem to be a common misunderstanding among anti-AI artists that Twitter's forcing it and they need to go elsewhere to avoid it, which is doubly wrong, for both of the reasons we state.

    Updated by anon7631 8 months ago

    5 Reply
    matteste
    8 months ago

    T34-38 said:

    It's basically defacement at this point.

    Yeah, there are already several good pieces I've avoided because of how dreadful these effects are.

    4 Reply