Human Generated Data

Title

Market, Juchitlan

Date

1930s

People

Artist: Emilio Amero, Mexican, 1901 - 1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2022.367

Copyright

© Estate of the Artist

Machine Generated Data

Tags

Amazon
created on 2023-04-27

Food 99.9
Meal 99.9
Cafeteria 99.9
Indoors 99.9
Restaurant 99.9
Clothing 99.9
Hat 99.9
Architecture 99.6
Building 99.6
Dining Room 99.6
Dining Table 99.6
Furniture 99.6
Room 99.6
Table 99.6
Adult 98.3
Bride 98.3
Female 98.3
Person 98.3
Wedding 98.3
Woman 98.3
Person 98.1
Baby 98.1
Person 98.1
Baby 98.1
Female 97.9
Person 97.9
Child 97.9
Girl 97.9
Person 97.7
Adult 97.4
Female 97.4
Person 97.4
Woman 97.4
Adult 97.1
Bride 97.1
Female 97.1
Person 97.1
Woman 97.1
People 96.3
Person 95.8
Person 95.3
Person 95
Baby 95
Person 94.7
Person 94.6
Person 94.2
Person 94.2
Person 93.5
Person 92.7
Person 92.3
Face 92.1
Head 92.1
Person 91.5
Person 89.9
Baby 89.9
Person 89.4
Person 89.3
Dish 88.1
Person 87.5
Person 87.2
Person 86.4
Adult 86.2
Female 86.2
Person 86.2
Woman 86.2
Person 85.9
Person 85
Baby 85
Person 84.4
Person 80.5
Person 79.4
Person 79.1
Person 72.1
Person 71.6
Person 71.1
Person 64.2
Food Court 57.2
Crowd 56.6
Fun 55.7
Party 55.7
Buffet 55.6
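
The Amazon tags above have the shape of output from the Rekognition DetectLabels API, which returns label names with confidence percentages. The sketch below shows one way such a list could be produced with boto3; the image file name and the 55% confidence floor are illustrative assumptions, not part of the museum record.

    import boto3

    # Hypothetical local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "market_juchitlan.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # illustrative floor; the lowest score listed above is 55.6
        )

    # Print label names with confidence scores, mirroring the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")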

Clarifai
created on 2023-10-13

people 100
group 99.8
many 99.6
woman 98.6
adult 98.3
child 97
group together 96.8
man 96.2
boy 94.4
wear 94.3
several 89.4
war 89
monochrome 88.5
recreation 86.8
veil 84.6
administration 83.5
music 83.3
street 81.9
outfit 75.7
military 75.6

Imagga
created on 2023-04-27

kin 25.3
art 16.6
decoration 16.3
grunge 13.6
celebration 13.5
cemetery 13.5
design 13
black 12.7
flower 12.3
retro 12.3
pattern 11.6
people 11.1
vintage 10.7
holiday 10.7
fun 10.5
floral 10.2
happy 10
male 10
silhouette 9.9
clock 9.6
man 9.6
person 9.5
play 9.5
symbol 9.4
old 9
dress 9
drawing 9
child 8.9
gift 8.6
fashion 8.3
cartoon 8
flowers 7.8
color 7.8
fear 7.7
money 7.6
decorative 7.5
mother 7.5
antique 7.3
cash 7.3
graphic 7.3
paint 7.2
dirty 7.2
collection 7.2
cute 7.2
religion 7.2
card 7.1
family 7.1

Google
created on 2023-04-27

Photograph 94.2
Hat 84.9
Art 78.7
Suit 77.4
Table 75.2
Snapshot 74.3
Sun hat 74.3
Event 73.6
Vintage clothing 71.7
Font 71.2
Stock photography 64.1
History 63.7
Room 63.6
Photo caption 61.1
Monochrome 59.3
Visual arts 57.4
Family reunion 56.4
Pattern 55.3
Crowd 55.1
Circle 52.5
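
The Google tags above resemble label annotations from the Cloud Vision API. A minimal sketch using the google-cloud-vision client library follows; the file name is an illustrative assumption, and the 0-1 scores returned by the API are scaled to percentages to match the listing.

    from google.cloud import vision

    # Hypothetical local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "market_juchitlan.jpg"

    client = vision.ImageAnnotatorClient()

    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    # Label detection returns descriptions with scores between 0 and 1.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")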

Microsoft
created on 2023-04-27

person 96.7
text 94
clothing 91.1
posing 91
drawing 86
cartoon 82.7
group 82.5
sketch 62.5
old 56.9
black and white 51.3
team 32

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 92.5%
Angry 59.7%
Calm 37.5%
Surprised 6.5%
Fear 6%
Sad 2.4%
Confused 0.8%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 16-22
Gender Female, 100%
Calm 52.4%
Sad 22.4%
Surprised 19.3%
Fear 6.7%
Confused 4.3%
Angry 3.9%
Disgusted 3.4%
Happy 1.2%

AWS Rekognition

Age 13-21
Gender Female, 100%
Calm 93.2%
Surprised 6.7%
Fear 6%
Sad 3.6%
Angry 1%
Confused 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Female, 100%
Sad 100%
Calm 11.9%
Surprised 7%
Fear 6%
Angry 1.9%
Disgusted 0.9%
Confused 0.8%
Happy 0.2%

AWS Rekognition

Age 13-21
Gender Male, 98.2%
Calm 92%
Surprised 6.3%
Fear 5.9%
Sad 4.3%
Angry 2.1%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 23-31
Gender Female, 58.1%
Calm 53%
Sad 23.8%
Surprised 11.7%
Fear 10.8%
Happy 6.4%
Angry 1.4%
Confused 1.4%
Disgusted 1.3%

AWS Rekognition

Age 16-24
Gender Female, 97.7%
Calm 76.6%
Sad 28.1%
Surprised 7.2%
Fear 6%
Confused 0.5%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 16-22
Gender Female, 99.8%
Sad 99%
Calm 30.3%
Happy 7.5%
Surprised 6.8%
Fear 6.1%
Disgusted 0.7%
Angry 0.7%
Confused 0.6%

AWS Rekognition

Age 13-21
Gender Female, 50.8%
Calm 65.9%
Confused 12.5%
Surprised 9.2%
Fear 6.4%
Angry 5.1%
Happy 4.7%
Sad 3.1%
Disgusted 2.8%

AWS Rekognition

Age 18-26
Gender Male, 99.6%
Calm 68.3%
Sad 10.2%
Fear 8.3%
Confused 8%
Surprised 7.6%
Angry 1.7%
Happy 1.6%
Disgusted 1%

AWS Rekognition

Age 20-28
Gender Female, 99.8%
Calm 53.9%
Angry 29.4%
Sad 8.4%
Surprised 6.9%
Fear 6.5%
Confused 1.8%
Disgusted 1.3%
Happy 0.8%

AWS Rekognition

Age 6-14
Gender Female, 96.8%
Angry 43.5%
Calm 20%
Surprised 12.6%
Happy 11.2%
Fear 7.3%
Sad 7.3%
Confused 3%
Disgusted 1.1%

AWS Rekognition

Age 4-12
Gender Male, 68.9%
Sad 88.6%
Calm 58.3%
Surprised 6.3%
Fear 5.9%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 13-21
Gender Female, 98.1%
Calm 77.7%
Angry 13.5%
Surprised 7.1%
Fear 6.3%
Sad 3.7%
Happy 1.3%
Confused 0.6%
Disgusted 0.5%

AWS Rekognition

Age 26-36
Gender Male, 87.8%
Calm 95.2%
Surprised 6.4%
Fear 5.9%
Sad 2.7%
Happy 0.8%
Confused 0.7%
Angry 0.6%
Disgusted 0.5%

AWS Rekognition

Age 35-43
Gender Male, 99.5%
Calm 98.8%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Angry 0.1%
Disgusted 0%
Happy 0%
Confused 0%

AWS Rekognition

Age 6-14
Gender Female, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.1%
Happy 0%
Angry 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 12-20
Gender Female, 99.6%
Angry 70%
Sad 43.5%
Surprised 6.9%
Fear 6.3%
Calm 2.1%
Happy 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 28-38
Gender Female, 51.8%
Calm 75.1%
Sad 14.5%
Fear 7.5%
Surprised 7%
Happy 3%
Confused 1%
Angry 0.6%
Disgusted 0.6%

AWS Rekognition

Age 22-30
Gender Male, 95.6%
Calm 98.8%
Surprised 6.6%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 13-21
Gender Female, 99.9%
Sad 64.4%
Fear 31.6%
Happy 12.3%
Angry 11.1%
Surprised 9.5%
Calm 6.6%
Disgusted 2.9%
Confused 1%

AWS Rekognition

Age 23-31
Gender Male, 67.2%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 3.6%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 28-38
Gender Male, 99.5%
Calm 90.9%
Fear 6.5%
Surprised 6.4%
Sad 4.5%
Angry 0.8%
Happy 0.4%
Disgusted 0.4%
Confused 0.1%
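
The age ranges, gender guesses, and per-emotion scores above match the shape of the Rekognition DetectFaces response when all facial attributes are requested; the emotion confidences are reported independently, which is why some of the listings above do not sum to 100%. A minimal boto3 sketch, with the image file name as an illustrative assumption:

    import boto3

    # Hypothetical local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "market_juchitlan.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")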

Microsoft Cognitive Services

Age 21
Gender Female

Microsoft Cognitive Services

Age 21
Gender Female

Microsoft Cognitive Services

Age 68
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 21
Gender Male

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
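
The Google Vision entries above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred, each rated on a likelihood scale) correspond to the face detection feature of the Cloud Vision API, which reports bucketed likelihoods rather than numeric scores. A minimal sketch, again assuming a hypothetical local copy of the image:

    from google.cloud import vision

    # Hypothetical local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "market_juchitlan.jpg"

    client = vision.ImageAnnotatorClient()

    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods come back as enum values such as VERY_UNLIKELY or POSSIBLE,
    # matching the wording in the listings above.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)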

Feature analysis

Amazon

Adult 98.3%
Adult 97.4%
Adult 97.1%
Adult 86.2%
Bride 98.3%
Bride 97.1%
Female 98.3%
Female 97.9%
Female 97.4%
Female 97.1%
Female 86.2%
Person 98.3%
Person 98.1%
Person 98.1%
Person 97.9%
Person 97.7%
Person 97.4%
Person 97.1%
Person 95.8%
Person 95.3%
Person 95%
Person 94.7%
Person 94.6%
Person 94.2%
Person 94.2%
Person 93.5%
Person 92.7%
Person 92.3%
Person 91.5%
Person 89.9%
Person 89.4%
Person 89.3%
Person 87.5%
Person 87.2%
Person 86.4%
Person 86.2%
Person 85.9%
Person 85%
Person 84.4%
Person 80.5%
Person 79.4%
Person 79.1%
Person 72.1%
Person 71.6%
Person 71.1%
Person 64.2%
Woman 98.3%
Woman 97.4%
Woman 97.1%
Woman 86.2%
Baby 98.1%
Baby 98.1%
Baby 95%
Baby 89.9%
Baby 85%
Child 97.9%
Girl 97.9%

Captions

Anthropic Claude

Created by claude-3-haiku-20240307 on 2024-12-31

This image appears to depict a bustling indoor market scene, with numerous people gathered around a crowded table displaying a variety of goods, including what seem to be various types of baked goods or pastries. The people in the image are wearing clothing that suggests this is an older, potentially historical setting. Many of the individuals have hats, scarves, and other accessories that give the scene a distinctive period feel. The overall composition and lighting of the image create a sense of energy and vibrancy within the crowded space.

Created by claude-3-opus-20240229 on 2024-12-31

The black and white photograph shows a large group of people, mostly women and children, gathered around a long table or set of tables filled with rows and rows of bowls and plates. Many of the women are wearing patterned dresses or headscarves, while some of the children appear to be sitting on the floor around the table. The image has an old-fashioned, vintage quality to it based on the style of clothing and the grainy film. It seems to depict some kind of communal meal or gathering, perhaps at a public event, shelter, or soup kitchen given the plain surroundings and large number of simple place settings.

Created by claude-3-5-sonnet-20241022 on 2024-12-31

This appears to be a historical black and white photograph showing a large group of people in what looks like a food service or soup kitchen setting. There are many white bowls or containers visible on long tables, and the people are dressed in clothing that appears to be from the 1930s or 1940s, with women wearing dresses with polka dots and floral patterns, and head coverings. The image has an overhead perspective, showing the crowd gathered around what seems to be serving tables. The scene suggests this might be from the Great Depression era or another period when community food service was common. The atmosphere appears busy and crowded, with many people waiting in line or gathered around the serving area.
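
Captions like the three above can be generated by sending an image to the Anthropic Messages API as a base64-encoded block alongside a text prompt. The sketch below assumes a local JPEG copy of the photograph and uses the claude-3-haiku-20240307 model credited above; the prompt wording is an illustrative assumption.

    import base64
    import anthropic

    # Hypothetical local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "market_juchitlan.jpg"

    with open(IMAGE_PATH, "rb") as f:
        image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    message = client.messages.create(
        model="claude-3-haiku-20240307",  # one of the models credited above
        max_tokens=300,
        messages=[
            {
                "role": "user",
                "content": [
                    {
                        "type": "image",
                        "source": {
                            "type": "base64",
                            "media_type": "image/jpeg",
                            "data": image_b64,
                        },
                    },
                    {"type": "text", "text": "Describe this photograph."},
                ],
            }
        ],
    )

    print(message.content[0].text)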