Human Generated Data

Title

Untitled (wedding guests seated outside on folding chairs)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8739

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Furniture 99.7
Chair 99.7
Person 99.7
Human 99.7
Person 99.7
Person 99.6
Person 98.9
Person 98.4
Apparel 97.4
Clothing 97.4
Outdoors 96.5
Nature 96.5
Countryside 96.5
Building 96.5
Shelter 96.5
Rural 96.5
Person 93.2
Face 87.7
Person 86.4
Dress 83.5
People 82
Person 76.8
Yard 76.6
Female 73.7
Kid 70.5
Child 70.5
Crowd 68.9
Girl 68.2
Table 66.6
Photography 64.9
Photo 64.9
Person 62.7
Housing 58
Hat 55.8
Person 44.8

Clarifai
created on 2023-10-25

people 99.9
group 99.2
group together 99
many 97.4
adult 96.7
child 95.7
man 94.8
woman 93.5
wear 93.5
administration 93.4
leader 92.5
several 92.4
outfit 89.5
music 89.4
recreation 87.7
war 87.6
chair 85.4
soldier 82.5
home 81
military 74.4

Imagga
created on 2022-01-09

chair 64.8
seat 39.9
folding chair 34.6
furniture 24.4
man 23.5
people 23.4
room 16.8
city 16.6
business 16.4
sitting 15.5
building 15.1
person 14.4
male 14.2
urban 14
outdoors 13.4
interior 13.3
men 12
street 12
adult 11.9
work 11.9
table 11.2
restaurant 11
architecture 10.9
house 10.9
suit 10.8
chairs 10.8
outdoor 10.7
modern 10.5
sit 10.4
lifestyle 10.1
crutch 10.1
leisure 10
old 9.7
musical instrument 9.4
day 9.4
outside 9.4
relax 9.3
travel 9.1
fashion 9
style 8.9
businessman 8.8
wall 8.5
window 8.5
classroom 8.3
school 8.2
home 8
smiling 8
design 7.9
couple 7.8
standing 7.8
staff 7.8
furnishing 7.7
attractive 7.7
rest 7.6
pedestrian 7.6
office 7.5
floor 7.4
vacation 7.4
group 7.2
computer 7.2
road 7.2
summer 7.1
working 7.1
indoors 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 97.2
chair 94.1
furniture 91.4
text 90.2
person 88.7
clothing 88.1
black and white 80.3
table 66.8
man 58.5
house 54.3
footwear 53.4
crowd 1

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 72.8%
Calm 99.5%
Happy 0.2%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 64.2%
Happy 87.5%
Calm 7.1%
Surprised 2.5%
Disgusted 1.3%
Angry 0.7%
Confused 0.6%
Sad 0.3%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Female, 98.2%
Calm 95.2%
Sad 3.5%
Surprised 0.4%
Disgusted 0.4%
Angry 0.3%
Confused 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Female, 53.5%
Calm 99.9%
Happy 0%
Sad 0%
Surprised 0%
Disgusted 0%
Fear 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 35-43
Gender Female, 55.6%
Happy 92.1%
Calm 6.3%
Surprised 1.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 67.2%
Calm 64.5%
Angry 21.2%
Sad 4.3%
Happy 3.1%
Confused 2.6%
Surprised 2.2%
Fear 1.4%
Disgusted 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Chair 99.7%
Person 99.7%

Text analysis

Amazon

38605
58
KODAK

Google

58 YT37A°2-A 38605
58
YT37A°2-A
38605