Human Generated Data

Title

Untitled (woman throwing Tupperware container across room during demonstration)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8814

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Person 99.1
Person 98.4
Person 96.7
Person 96.3
Person 95.7
Person 95.2
Person 92.9
Furniture 91
Person 90.9
People 89.3
Person 88.5
Person 87.6
Musician 86.1
Musical Instrument 86.1
Leisure Activities 84.8
Person 80.9
Chair 77.9
Clothing 70.1
Apparel 70.1
Crowd 67.6
Room 66.3
Indoors 66.3
Music Band 66
Person 63.8
Suit 60.7
Coat 60.7
Overcoat 60.7
Photography 57.8
Photo 57.8
Living Room 55.8
Person 54
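
The labels above have the shape of output from Amazon Rekognition's DetectLabels API. A minimal boto3 sketch of that call follows; the file name, region, and MinConfidence threshold are illustrative assumptions, not values recorded here.

    import boto3

    # Region and file name are placeholders; any Rekognition-enabled region works.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_demo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed cutoff; this record lists labels down to ~54
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')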

Clarifai
created on 2023-10-25

people 99.9
group together 99.2
group 99
adult 98.2
furniture 97.5
man 97.1
woman 95.9
sit 94.8
many 94.5
recreation 93.9
monochrome 92.4
actor 90.8
music 90.2
room 89.5
actress 89.1
child 88.4
several 88.3
chair 87.9
sitting 84.4
guitar 84.1
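
Concept tags like these typically come from Clarifai's v2 predict endpoint run against a general recognition model. The sketch below is a hedged illustration: the model alias, access token, and image URL are all placeholders, and the record does not state which model version produced these tags.

    import requests

    # Token and model alias are placeholders; see Clarifai's v2 predict docs.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={
            "Authorization": "Key YOUR_CLARIFAI_PAT",
            "Content-Type": "application/json",
        },
        json={"inputs": [{"data": {"image": {"url": "https://example.com/steinmetz_demo.jpg"}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai returns values in 0-1; the record shows them as percentages.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')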

Imagga
created on 2022-01-09

shop 72
mercantile establishment 50.5
barbershop 40.4
salon 39.8
toyshop 37.1
place of business 33.7
man 17.5
people 17.3
establishment 16.8
glass 13.6
person 12.4
adult 11.7
black 11.4
chair 10.8
style 10.4
table 10.4
play 10.3
work 9.8
equipment 9.7
indoors 9.7
women 9.5
decoration 9.4
male 9.2
group 8.9
interior 8.8
working 8.8
luxury 8.6
business 8.5
instrument 8.4
city 8.3
music 8.3
shopping 8.2
occupation 8.2
sport 8.2
clothing 8.1
lifestyle 7.9
men 7.7
modern 7.7
professional 7.7
window 7.6
fashion 7.5
buy 7.5
human 7.5
leisure 7.5
technology 7.4
bass 7.4
celebration 7.2
team 7.2
holiday 7.2
life 7.1
portrait 7.1
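
Tags in this form plausibly come from Imagga's v2 /tags endpoint, which scores tags 0-100. A hedged sketch; the API key, secret, and image URL are placeholders.

    import requests

    # Credentials and image URL are placeholders; endpoint per Imagga's v2 docs.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/steinmetz_demo.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')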

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.4
text 98.3
clothing 89.8
people 75.5
group 70.7
old 63.9
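
Microsoft's tags are consistent with the Azure Computer Vision analyze operation. The sketch below assumes the v3.2 REST endpoint; the resource endpoint, key, and API version are placeholders, since the record does not state which version produced these tags.

    import requests

    # Resource endpoint and key are placeholders for an Azure Computer Vision resource.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"

    with open("steinmetz_demo.jpg", "rb") as f:
        resp = requests.post(
            f"{endpoint}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": "YOUR_KEY",
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # Azure reports confidence in 0-1; the record shows percentages.
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')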

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 96%
Happy 99.3%
Surprised 0.2%
Confused 0.2%
Calm 0.1%
Disgusted 0.1%
Fear 0%
Sad 0%
Angry 0%

AWS Rekognition

Age 45-51
Gender Male, 98.3%
Calm 98.9%
Sad 0.9%
Confused 0%
Happy 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Female, 95.5%
Calm 98.1%
Surprised 0.7%
Confused 0.6%
Disgusted 0.3%
Angry 0.1%
Sad 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 64.6%
Calm 98.8%
Happy 0.8%
Confused 0.1%
Sad 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Surprised 94.9%
Calm 4%
Fear 0.3%
Disgusted 0.2%
Sad 0.2%
Angry 0.2%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 39-47
Gender Male, 99%
Calm 99.6%
Happy 0.2%
Sad 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Calm 83.4%
Sad 6.9%
Disgusted 6.4%
Confused 1.5%
Happy 0.8%
Fear 0.4%
Surprised 0.3%
Angry 0.3%

AWS Rekognition

Age 28-38
Gender Male, 96.5%
Calm 65.8%
Sad 21.8%
Happy 9.8%
Disgusted 0.7%
Confused 0.7%
Surprised 0.5%
Fear 0.4%
Angry 0.3%
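
The age ranges, gender estimates, and per-emotion scores above match the shape of Rekognition's DetectFaces response when all attributes are requested. A minimal boto3 sketch, with region and file name as placeholders:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

    with open("steinmetz_demo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unsorted; order them to match the record's layout.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')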

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
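
The buckets above (Very unlikely, Unlikely, Likely, and so on) match Google Cloud Vision face detection, which reports enum likelihoods rather than numeric confidences. A minimal sketch assuming application-default credentials for a project with the Vision API enabled; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_demo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Enum names like VERY_UNLIKELY map to the record's "Very unlikely".
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)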

Feature analysis

Amazon

Person 99.3%
Chair 77.9%

Text analysis

Amazon

39465-A
KODVK
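
Short strings such as these often come from edge markings on a scanned negative and are typical of Rekognition's DetectText output. A minimal boto3 sketch, with region and file name as placeholders:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

    with open("steinmetz_demo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Each detection is a LINE or a WORD; print whole lines only.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])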