
Viewing last 25 versions of post by Adorable Blue Fox in topic Opening Thread

Adorable Blue Fox

Got any cookies?
[@Admin](/forums/meta/topics/opening-thread?post_id=311#post_311)

I can collect all the `ai content` image comments; I have the script ready.

There are an estimated 29,381 images in total; those with at least one comment fill the first 317 pages (50 images per page) when sorted by comment count.

I will collect the images with a 5-second delay between requests to avoid rate limiting and undue load on the server.

So, it'll take approximately 23 hours to process all the images with comments.
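The collection loop could be sketched like this — a minimal pager with a fixed delay between requests. The `fetch` callback and page structure are assumptions for illustration, not the site's actual API:

```python
import time
from typing import Callable, Dict, Iterator

def collect_pages(fetch: Callable[[int], Dict], pages: int,
                  delay: float = 5.0) -> Iterator[Dict]:
    """Yield each page of results, sleeping `delay` seconds between
    requests so the server is hit at most once per interval."""
    for page in range(1, pages + 1):
        yield fetch(page)
        if page < pages:  # no need to sleep after the final request
            time.sleep(delay)

def estimated_hours(requests: int, delay: float = 5.0) -> float:
    """Rough runtime estimate: one request per image plus the fixed delay."""
    return requests * delay / 3600

# 317 pages x 50 images = ~15,850 images with comments;
# at a 5 s delay that is roughly 22 hours, in line with the
# "approximately 23 hours" quoted above.
```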

For now, I will store the collected comments until Philomena's development allows for importing comments. When that capability is available, we can decide whether to assign all comments to anonymous users, considering that most users who initially made the comments won't exist on this site.
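If the anonymous route is chosen, the stored comments could be normalized before import along these lines. The field names here are hypothetical stand-ins, not Philomena's real schema:

```python
from typing import Dict

def anonymize(comment: Dict) -> Dict:
    """Strip identifying fields so a comment can be attributed to an
    anonymous user on import; body and timestamp are kept as-is.
    Field names ("author", "avatar", "user_id") are illustrative."""
    kept = {k: v for k, v in comment.items()
            if k not in {"author", "avatar", "user_id"}}
    kept["anonymous"] = True
    return kept
```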

I have my dev environment for Philomena, so I'll likely work on a solution for this. No promises at this stage, though.

Let me know whether to proceed or if you need any details from me.

---

EDIT:

I just looked at the nightly dump and found that the comments are included in it. That makes the API scraping script redundant; I can process everything locally.

The only thing the API provides that the dump does not is each user's profile picture as it existed at collection time, but if the comments end up being posted anonymously (or handled some other way entirely), that won't matter.
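Counting comments per tag from the dump locally might look like this sketch — the row shapes are assumptions, since the real dump is a database archive with its own schema:

```python
from collections import Counter
from typing import Iterable, Mapping, Set, Tuple

def comments_per_tag(comments: Iterable[Tuple[int, int]],
                     image_tags: Mapping[int, Set[str]],
                     wanted: Set[str]) -> Counter:
    """Count comments per tag of interest.

    comments:   (comment_id, image_id) pairs extracted from the dump
    image_tags: image_id -> set of tag names on that image
    wanted:     tags to tally, e.g. {"ai content", "ai generated"}
    """
    counts: Counter = Counter()
    for _cid, image_id in comments:
        for tag in image_tags.get(image_id, set()) & wanted:
            counts[tag] += 1
    return counts
```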

EDIT 2:

All the comments are now exported: 50,548 on images tagged `ai content` and 42,435 on images tagged `ai generated`. I will have to export again before January 6, 2025, to get the most up-to-date data.
Reason: EDIT 2
Edited by Adorable Blue Fox