Human Generated Data

Title

Untitled (wedding guests seated and under umbrellas)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8616

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.4
Human 98.4
Person 97.4
Person 97.3
Person 97
Person 96.7
Person 94.9
Person 92.5
Person 92.3
Clothing 91.8
Apparel 91.8
Nature 91.6
Outdoors 91.2
Person 89
Person 88.6
Person 88.5
Person 88.3
Person 88.2
Person 85.5
Face 84.2
Rural 80.6
Shelter 80.6
Building 80.6
Countryside 80.6
Female 79.3
People 79.2
Meal 75
Food 75
Crowd 71.4
Person 70.4
Suit 67.4
Coat 67.4
Overcoat 67.4
Person 65.2
Girl 63.4
Leisure Activities 61.9
Woman 61.1
Funeral 59.5
Plant 58.3
Vacation 58.3
Picnic 57.4
Field 55.1
Person 54.5

Clarifai
created on 2023-10-25

people 100
adult 98.3
military 98.1
group 97.6
many 97.5
man 96.3
group together 96.1
administration 93.5
war 91.7
soldier 91.6
woman 91.2
leader 89.9
wear 88.7
child 88.2
vehicle 81.6
tent 81.5
uniform 80.8
outfit 77.5
military uniform 75.6
campsite 75.2

Imagga
created on 2022-01-09

musical instrument 30.2
wind instrument 20.6
accordion 19.7
old 18.1
newspaper 16.8
man 16.8
dirty 16.3
keyboard instrument 16.1
industrial 14.5
rifle 14.4
product 13.9
person 13.9
danger 13.6
mask 13.4
landscape 13.4
gun 13.3
silhouette 13.2
vintage 13.2
black 13.2
destruction 12.7
toxic 12.7
travel 12.7
sky 12.1
grunge 11.9
art 11.9
protection 11.8
nuclear 11.6
fog 11.6
park 11.5
environment 11.5
male 11.3
smoke 11.2
creation 11
stalker 10.9
dark 10.9
radioactive 10.8
symbol 10.8
protective 10.7
military 10.6
chemical 10.6
people 10.6
gas 10.6
building 10.5
water 10
vacation 9.8
radiation 9.8
soldier 9.8
accident 9.8
forest 9.6
ancient 9.5
power 9.2
outdoor 9.2
city 9.1
tourism 9.1
structure 9
disaster 8.8
steam 8.7
scene 8.7
cold 8.6
snow 8.4
adult 8.4
summer 8.4
history 8
weapon 8
firearm 8
mountain 8
sax 8
antique 7.8
protect 7.7
industry 7.7
winter 7.7
texture 7.6
tourist 7.6
adventure 7.6
daily 7.6
retro 7.4
safety 7.4
part 7.4
stone 7.3
sun 7.2
scenery 7.2
holiday 7.2
trees 7.1
world 7.1
day 7.1
paper 7.1
country 7
scenic 7
season 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.5
outdoor 95.5
clothing 94
person 92.1
woman 72.4
dress 70.4
dance 69.1
black and white 65
man 62.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Female, 71.3%
Calm 99.9%
Happy 0%
Surprised 0%
Disgusted 0%
Sad 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 77.7%
Sad 66.5%
Confused 14.2%
Calm 10.5%
Fear 2.2%
Surprised 2.1%
Disgusted 1.7%
Angry 1.4%
Happy 1.3%

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 99.7%
Surprised 0.1%
Happy 0.1%
Sad 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 77.2%
Calm 69.8%
Confused 22.6%
Disgusted 4%
Fear 1%
Sad 0.9%
Happy 0.9%
Surprised 0.6%
Angry 0.3%

AWS Rekognition

Age 28-38
Gender Male, 94.8%
Calm 99.9%
Sad 0%
Happy 0%
Confused 0%
Surprised 0%
Fear 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 48-56
Gender Female, 81.3%
Calm 96.9%
Happy 2%
Surprised 0.4%
Disgusted 0.3%
Confused 0.2%
Sad 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 22-30
Gender Female, 60.6%
Calm 96.9%
Surprised 1.7%
Fear 0.4%
Sad 0.4%
Confused 0.3%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 36-44
Gender Female, 99.6%
Calm 74.1%
Sad 9.2%
Happy 9.1%
Angry 3.4%
Confused 1.3%
Fear 1%
Surprised 1%
Disgusted 0.8%

AWS Rekognition

Age 27-37
Gender Male, 96.4%
Calm 91.6%
Happy 5.1%
Sad 1.4%
Angry 0.6%
Surprised 0.4%
Disgusted 0.3%
Confused 0.3%
Fear 0.3%

AWS Rekognition

Age 23-31
Gender Male, 95.6%
Fear 53.6%
Calm 29.9%
Happy 8.5%
Sad 2.8%
Surprised 2.7%
Angry 1.1%
Confused 0.8%
Disgusted 0.6%

AWS Rekognition

Age 43-51
Gender Female, 50.1%
Calm 99.6%
Happy 0.2%
Surprised 0.1%
Sad 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 82.4%
Calm 99.7%
Sad 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%
Confused 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 83.1%
Calm 99.7%
Sad 0.1%
Happy 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 48-54
Gender Male, 70.7%
Calm 99.8%
Sad 0.1%
Happy 0%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 98.4%

Text analysis

Amazon

19980
19980.
3

Google

19980. 19980.
19980.