Machine Generated Data
Face analysis
The face-level estimates below come from three services: AWS Rekognition, Microsoft Cognitive Services, and Google Vision.
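The AWS Rekognition blocks that follow report an estimated age range, a gender guess with a confidence, and an independent confidence per emotion. For context, here is a minimal sketch of the kind of DetectFaces call that yields those fields; the region and file name are placeholders, and this is an illustration rather than the pipeline that actually produced this page.

```python
import boto3

# Placeholder region; boto3 reads credentials from the environment.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 7, "High": 17}
    gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 92.5}
    print(f"Age | {age['Low']}-{age['High']}")
    print(f"Gender | {gender['Value']}, {gender['Confidence']:.1f}%")
    # One independent confidence per emotion, sorted as in the tables below.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} | {emotion['Confidence']:.1f}%")
```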
![](https://ids.lib.harvard.edu/ids/iiif/496746811/914,608,65,86/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 7-17 |
| Gender | Female, 92.5% |
| Angry | 59.7% |
| Calm | 37.5% |
| Surprised | 6.5% |
| Fear | 6% |
| Sad | 2.4% |
| Confused | 0.8% |
| Disgusted | 0.4% |
| Happy | 0.2% |
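Every thumbnail on this page is a IIIF Image API URL: after the image identifier, the path encodes a pixel region as x,y,w,h, followed by a size (full), a rotation (0), and a quality/format (native.jpg). A small illustrative helper that rebuilds such a crop URL from a detected bounding box; the function name is invented for this sketch.

```python
def iiif_crop_url(identifier: str, x: int, y: int, w: int, h: int,
                  base: str = "https://ids.lib.harvard.edu/ids/iiif") -> str:
    """Return a IIIF Image API URL for the pixel region (x, y, w, h).

    Path shape: {base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}
    """
    return f"{base}/{identifier}/{x},{y},{w},{h}/full/0/native.jpg"

# Reproduces the first face crop in this section.
print(iiif_crop_url("496746811", 914, 608, 65, 86))
```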
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1218,131,57,82/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 16-22 |
| Gender | Female, 100% |
| Calm | 52.4% |
| Sad | 22.4% |
| Surprised | 19.3% |
| Fear | 6.7% |
| Confused | 4.3% |
| Angry | 3.9% |
| Disgusted | 3.4% |
| Happy | 1.2% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/86,411,70,87/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 13-21 |
| Gender | Female, 100% |
| Calm | 93.2% |
| Surprised | 6.7% |
| Fear | 6% |
| Sad | 3.6% |
| Angry | 1% |
| Confused | 0.4% |
| Disgusted | 0.2% |
| Happy | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1398,228,65,75/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 22-30 |
| Gender | Female, 100% |
| Sad | 100% |
| Calm | 11.9% |
| Surprised | 7% |
| Fear | 6% |
| Angry | 1.9% |
| Disgusted | 0.9% |
| Confused | 0.8% |
| Happy | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/368,319,63,81/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 13-21 |
| Gender | Male, 98.2% |
| Calm | 92% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 4.3% |
| Angry | 2.1% |
| Confused | 0.2% |
| Disgusted | 0.1% |
| Happy | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/782,39,57,81/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-31 |
| Gender | Female, 58.1% |
| Calm | 53% |
| Sad | 23.8% |
| Surprised | 11.7% |
| Fear | 10.8% |
| Happy | 6.4% |
| Angry | 1.4% |
| Confused | 1.4% |
| Disgusted | 1.3% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1195,271,63,86/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 16-24 |
| Gender | Female, 97.7% |
| Calm | 76.6% |
| Sad | 28.1% |
| Surprised | 7.2% |
| Fear | 6% |
| Confused | 0.5% |
| Angry | 0.3% |
| Disgusted | 0.2% |
| Happy | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/2105,116,63,71/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 16-22 |
| Gender | Female, 99.8% |
| Sad | 99% |
| Calm | 30.3% |
| Happy | 7.5% |
| Surprised | 6.8% |
| Fear | 6.1% |
| Disgusted | 0.7% |
| Angry | 0.7% |
| Confused | 0.6% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/2230,1042,93,111/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 13-21 |
| Gender | Female, 50.8% |
| Calm | 65.9% |
| Confused | 12.5% |
| Surprised | 9.2% |
| Fear | 6.4% |
| Angry | 5.1% |
| Happy | 4.7% |
| Sad | 3.1% |
| Disgusted | 2.8% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/407,760,69,85/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 18-26 |
| Gender | Male, 99.6% |
| Calm | 68.3% |
| Sad | 10.2% |
| Fear | 8.3% |
| Confused | 8% |
| Surprised | 7.6% |
| Angry | 1.7% |
| Happy | 1.6% |
| Disgusted | 1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1780,1459,99,97/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-28 |
| Gender | Female, 99.8% |
| Calm | 53.9% |
| Angry | 29.4% |
| Sad | 8.4% |
| Surprised | 6.9% |
| Fear | 6.5% |
| Confused | 1.8% |
| Disgusted | 1.3% |
| Happy | 0.8% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/101,1119,91,94/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 6-14 |
| Gender | Female, 96.8% |
| Angry | 43.5% |
| Calm | 20% |
| Surprised | 12.6% |
| Happy | 11.2% |
| Fear | 7.3% |
| Sad | 7.3% |
| Confused | 3% |
| Disgusted | 1.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1458,788,62,77/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 4-12 |
| Gender | Male, 68.9% |
| Sad | 88.6% |
| Calm | 58.3% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Confused | 0.2% |
| Angry | 0.1% |
| Happy | 0.1% |
| Disgusted | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/898,198,57,78/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 13-21 |
| Gender | Female, 98.1% |
| Calm | 77.7% |
| Angry | 13.5% |
| Surprised | 7.1% |
| Fear | 6.3% |
| Sad | 3.7% |
| Happy | 1.3% |
| Confused | 0.6% |
| Disgusted | 0.5% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1504,1328,86,92/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 26-36 |
| Gender | Male, 87.8% |
| Calm | 95.2% |
| Surprised | 6.4% |
| Fear | 5.9% |
| Sad | 2.7% |
| Happy | 0.8% |
| Confused | 0.7% |
| Angry | 0.6% |
| Disgusted | 0.5% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/427,50,63,83/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-43 |
| Gender | Male, 99.5% |
| Calm | 98.8% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.4% |
| Angry | 0.1% |
| Disgusted | 0% |
| Happy | 0% |
| Confused | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/2219,516,74,85/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 6-14 |
| Gender | Female, 100% |
| Sad | 100% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Calm | 0.1% |
| Happy | 0% |
| Angry | 0% |
| Confused | 0% |
| Disgusted | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/211,1483,76,89/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 12-20 |
| Gender | Female, 99.6% |
| Angry | 70% |
| Sad | 43.5% |
| Surprised | 6.9% |
| Fear | 6.3% |
| Calm | 2.1% |
| Happy | 0.3% |
| Disgusted | 0.3% |
| Confused | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/684,480,64,77/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 28-38 |
| Gender | Female, 51.8% |
| Calm | 75.1% |
| Sad | 14.5% |
| Fear | 7.5% |
| Surprised | 7% |
| Happy | 3% |
| Confused | 1% |
| Angry | 0.6% |
| Disgusted | 0.6% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/489,496,70,96/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 22-30 |
| Gender | Male, 95.6% |
| Calm | 98.8% |
| Surprised | 6.6% |
| Fear | 5.9% |
| Sad | 2.2% |
| Confused | 0.1% |
| Angry | 0.1% |
| Disgusted | 0.1% |
| Happy | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/83,46,60,64/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 13-21 |
| Gender | Female, 99.9% |
| Sad | 64.4% |
| Fear | 31.6% |
| Happy | 12.3% |
| Angry | 11.1% |
| Surprised | 9.5% |
| Calm | 6.6% |
| Disgusted | 2.9% |
| Confused | 1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1757,135,54,75/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-31 |
| Gender | Male, 67.2% |
| Sad | 100% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Calm | 3.6% |
| Confused | 0.3% |
| Angry | 0.2% |
| Disgusted | 0.1% |
| Happy | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/926,488,44,71/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 28-38 |
| Gender | Male, 99.5% |
| Calm | 90.9% |
| Fear | 6.5% |
| Surprised | 6.4% |
| Sad | 4.5% |
| Angry | 0.8% |
| Happy | 0.4% |
| Disgusted | 0.4% |
| Confused | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/2196,527,92,92/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 21 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/87,420,89,89/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 21 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/410,765,87,87/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 68 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/108,1133,86,86/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 30 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/921,620,81,81/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 21 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1394,239,70,70/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 23 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1215,146,66,66/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 23 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/2099,119,65,65/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 29 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/903,209,64,64/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 22 |
| Gender | Female |
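The Microsoft age and gender estimates above match the shape of the Face detect operation in Azure Cognitive Services (attributes Microsoft has since restricted). A rough sketch against that legacy REST endpoint; the endpoint and key are placeholders, and the exact API version used for this page is not recorded here.

```python
import requests

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

with open("photo.jpg", "rb") as f:  # placeholder image file
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
    resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]  # e.g. {"age": 22.0, "gender": "female"}
    print(f"Age | {attrs['age']:.0f}")
    print(f"Gender | {attrs['gender'].capitalize()}")
```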
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1198,106,94,110/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Possible |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/406,21,100,117/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/891,582,104,120/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/66,382,108,126/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1174,235,103,120/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1765,1407,131,153/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/2196,484,115,133/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/765,23,90,105/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1387,201,86,98/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very likely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/79,1068,131,153/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Likely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/469,486,95,111/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/874,172,96,112/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/952,119,70,80/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/1018,1073,91,106/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/66,19,87,102/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/496746811/349,294,102,119/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
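Unlike the two services above, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why these rows read "Very unlikely" and so on. A minimal sketch with the official google-cloud-vision client library; the file name is a placeholder.

```python
from google.cloud import vision

def bucket(value) -> str:
    # Map the Likelihood enum to the page's wording, e.g. "Very unlikely".
    return vision.Likelihood(value).name.replace("_", " ").capitalize()

client = vision.ImageAnnotatorClient()  # credentials from the environment

with open("photo.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise |", bucket(face.surprise_likelihood))
    print("Anger |", bucket(face.anger_likelihood))
    print("Sorrow |", bucket(face.sorrow_likelihood))
    print("Joy |", bucket(face.joy_likelihood))
    print("Headwear |", bucket(face.headwear_likelihood))
    print("Blurred |", bucket(face.blurred_likelihood))
```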
Feature analysis
Amazon
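The labels below, each tied to a cropped region, have the shape of Rekognition's DetectLabels response, where labels such as Person carry one bounding box per detected instance. An illustrative call; the file name and confidence threshold are placeholders.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # placeholder threshold
    )

for label in response["Labels"]:
    # Bounding boxes are fractions of the image width/height, so producing
    # pixel crops like the IIIF regions below means scaling by the dimensions.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} | {instance['Confidence']:.1f}% | {box}")
```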
| Crop | Label | Confidence |
| --- | --- | --- |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/898,1047,316,591/full/0/native.jpg) | Adult | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1977,1002,434,685/full/0/native.jpg) | Adult | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1619,206,327,683/full/0/native.jpg) | Adult | 97.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1181,1068,266,478/full/0/native.jpg) | Adult | 86.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/898,1047,316,591/full/0/native.jpg) | Bride | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1619,206,327,683/full/0/native.jpg) | Bride | 97.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/898,1047,316,591/full/0/native.jpg) | Female | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/9,1055,334,537/full/0/native.jpg) | Female | 97.9% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1977,1002,434,685/full/0/native.jpg) | Female | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1619,206,327,683/full/0/native.jpg) | Female | 97.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1181,1068,266,478/full/0/native.jpg) | Female | 86.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/898,1047,316,591/full/0/native.jpg) | Person | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1351,1242,319,483/full/0/native.jpg) | Person | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/427,1017,389,605/full/0/native.jpg) | Person | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/9,1055,334,537/full/0/native.jpg) | Person | 97.9% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1681,1376,319,321/full/0/native.jpg) | Person | 97.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1977,1002,434,685/full/0/native.jpg) | Person | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1619,206,327,683/full/0/native.jpg) | Person | 97.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/364,15,243,314/full/0/native.jpg) | Person | 95.8% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/879,568,212,384/full/0/native.jpg) | Person | 95.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/53,1420,439,289/full/0/native.jpg) | Person | 95% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/43,17,177,371/full/0/native.jpg) | Person | 94.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1148,84,209,252/full/0/native.jpg) | Person | 94.6% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1795,650,343,289/full/0/native.jpg) | Person | 94.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/703,20,268,422/full/0/native.jpg) | Person | 94.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1407,755,214,222/full/0/native.jpg) | Person | 93.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1073,513,241,574/full/0/native.jpg) | Person | 92.7% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/21,377,254,248/full/0/native.jpg) | Person | 92.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/2062,79,180,196/full/0/native.jpg) | Person | 91.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/609,431,275,512/full/0/native.jpg) | Person | 89.9% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/40,521,229,280/full/0/native.jpg) | Person | 89.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/370,715,172,198/full/0/native.jpg) | Person | 89.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/413,468,300,359/full/0/native.jpg) | Person | 87.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/273,280,226,265/full/0/native.jpg) | Person | 87.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/801,170,286,359/full/0/native.jpg) | Person | 86.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1181,1068,266,478/full/0/native.jpg) | Person | 86.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1092,226,218,386/full/0/native.jpg) | Person | 85.9% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1341,178,292,605/full/0/native.jpg) | Person | 85% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/2292,305,239,601/full/0/native.jpg) | Person | 84.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/2138,477,218,384/full/0/native.jpg) | Person | 80.5% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1743,47,117,179/full/0/native.jpg) | Person | 79.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/2382,1198,145,491/full/0/native.jpg) | Person | 79.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/343,230,184,191/full/0/native.jpg) | Person | 72.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/478,159,222,283/full/0/native.jpg) | Person | 71.6% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1841,70,137,198/full/0/native.jpg) | Person | 71.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/2037,65,108,166/full/0/native.jpg) | Person | 64.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/898,1047,316,591/full/0/native.jpg) | Woman | 98.3% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1977,1002,434,685/full/0/native.jpg) | Woman | 97.4% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1619,206,327,683/full/0/native.jpg) | Woman | 97.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1181,1068,266,478/full/0/native.jpg) | Woman | 86.2% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1351,1242,319,483/full/0/native.jpg) | Baby | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/427,1017,389,605/full/0/native.jpg) | Baby | 98.1% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/53,1420,439,289/full/0/native.jpg) | Baby | 95% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/609,431,275,512/full/0/native.jpg) | Baby | 89.9% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/1341,178,292,605/full/0/native.jpg) | Baby | 85% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/9,1055,334,537/full/0/native.jpg) | Child | 97.9% |
| ![](https://ids.lib.harvard.edu/ids/iiif/496746811/9,1055,334,537/full/0/native.jpg) | Girl | 97.9% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 57.6% |
| interior objects | 35.3% |
| streetview architecture | 2.9% |
| people portraits | 1.9% |
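These four categories match the vocabulary of Imagga's personal_photos categorizer. A sketch of the corresponding REST call, assuming Imagga's v2 categories endpoint; the credentials and image URL are placeholders, and the categorizer choice is an inference from the labels above.

```python
import requests

AUTH = ("<imagga-api-key>", "<imagga-api-secret>")  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder image
    auth=AUTH,  # HTTP basic auth
)
resp.raise_for_status()

for cat in resp.json()["result"]["categories"]:
    print(f"{cat['name']['en']} | {cat['confidence']:.1f}%")
```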
Captions
Microsoft
Created on 2023-04-27
| Caption | Confidence |
| --- | --- |
| a group of people posing for a photo | 92.6% |
| a group of people posing for the camera | 92.5% |
| a group of people posing for a picture | 92.4% |
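Several near-duplicate captions with close confidences are characteristic of the Azure Computer Vision describe operation, which returns up to maxCandidates candidate captions for one image. An illustrative request; the endpoint, key, and image URL are placeholders.

```python
import requests

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},  # placeholder image
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} | {caption['confidence'] * 100:.1f}%")
```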
Anthropic Claude
Created by claude-3-haiku-20240307 on 2024-12-31
This image appears to depict a bustling indoor market scene, with numerous people gathered around a crowded table displaying a variety of goods, including what seem to be various types of baked goods or pastries. The people in the image are wearing clothing that suggests this is an older, potentially historical setting. Many of the individuals have hats, scarves, and other accessories that give the scene a distinctive period feel. The overall composition and lighting of the image create a sense of energy and vibrancy within the crowded space.
Created by claude-3-opus-20240229 on 2024-12-31
The black and white photograph shows a large group of people, mostly women and children, gathered around a long table or set of tables filled with rows and rows of bowls and plates. Many of the women are wearing patterned dresses or headscarves, while some of the children appear to be sitting on the floor around the table. The image has an old-fashioned, vintage quality to it based on the style of clothing and the grainy film. It seems to depict some kind of communal meal or gathering, perhaps at a public event, shelter, or soup kitchen given the plain surroundings and large number of simple place settings.
Created by claude-3-5-sonnet-20241022 on 2024-12-31
This appears to be a historical black and white photograph showing a large group of people in what looks like a food service or soup kitchen setting. There are many white bowls or containers visible on long tables, and the people are dressed in clothing that appears to be from the 1930s or 1940s, with women wearing dresses with polka dots and floral patterns, and head coverings. The image has an overhead perspective, showing the crowd gathered around what seems to be serving tables. The scene suggests this might be from the Great Depression era or another period when community food service was common. The atmosphere appears busy and crowded, with many people waiting in line or gathered around the serving area.
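Each Claude caption above names the model version that produced it. A minimal sketch of requesting such a caption through the Anthropic Messages API, using the model ID from the first entry; the prompt wording and token limit are assumptions.

```python
import base64

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("photo.jpg", "rb") as f:  # placeholder image file
    image_b64 = base64.standard_b64encode(f.read()).decode("ascii")

message = client.messages.create(
    model="claude-3-haiku-20240307",  # model ID from the first caption above
    max_tokens=500,                   # assumed limit
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64",
                        "media_type": "image/jpeg",
                        "data": image_b64}},
            {"type": "text", "text": "Describe this photograph."},  # assumed prompt
        ],
    }],
)
print(message.content[0].text)
```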