Human Generated Data

Title

Untitled (woman throwing Tupperware container across room during demonstration)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8815

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 98.8
Person 98
Person 97.8
Person 97.5
Person 95.5
Chair 95
Furniture 95
Person 91.2
People 89
Person 88.4
Person 87.3
Person 86.9
Person 70.6
Person 69.4
Worker 68.9
Hairdresser 67.3
Female 64.9
Leisure Activities 64.7
Person 64.4
Crowd 64.3
Photography 63.4
Photo 63.4
Apparel 61.9
Clothing 61.9
Portrait 60.8
Face 60.8
Room 59.1
Indoors 59.1
Musical Instrument 57.4
Musician 57.4
Girl 56.7
Person 45.1
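
The figure beside each Amazon tag is a Rekognition confidence score (0-100). A minimal sketch of how such a list can be produced with boto3, assuming configured AWS credentials; "photo.jpg" is a hypothetical placeholder for the image file, not the museum's actual pipeline:

import boto3

# Sketch only; assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=45,  # the list above bottoms out near 45%
    )
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')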

Clarifai
created on 2023-10-25

people 99.9
group 99.1
group together 98.6
adult 98.2
man 96.5
woman 96.1
child 94
furniture 93.9
music 93.4
monochrome 92.5
several 92.1
many 91.7
sit 90.7
actress 90
chair 89.7
administration 87.8
musician 87.7
recreation 87.6
actor 87
outfit 86.8
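
Clarifai's values are likewise confidence scores. A hedged sketch against Clarifai's v2 REST "outputs" endpoint using requests; YOUR_API_KEY and the image URL are hypothetical placeholders, and the model name follows Clarifai's public general-recognition model:

import requests

# Sketch only; key and URL are placeholders.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))  # value is on a 0-1 scale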

Imagga
created on 2022-01-09

salon 88.3
shop 42.3
toyshop 31.2
mercantile establishment 26.9
man 18.1
place of business 18
people 17.8
person 12.8
adult 12.4
table 12.1
male 12
glass 11.7
interior 11.5
chair 11.4
indoors 11.4
women 11.1
black 10.8
decoration 10.1
equipment 10.1
room 9.7
style 9.6
men 9.4
leisure 9.1
fashion 9
retro 9
establishment 9
lifestyle 8.7
holiday 8.6
party 8.6
luxury 8.6
instrument 8.4
horizontal 8.4
sport 8.2
indoor 8.2
shoe shop 8.2
home 8
education 7.8
sitting 7.7
setting 7.7
knife 7.7
health 7.6
city 7.5
vintage 7.4
clothing 7.4
shopping 7.3
music 7.3
group 7.2
decor 7.1
medical 7.1
medicine 7
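
Imagga's tagger can be queried in the same style. A sketch against Imagga's /v2/tags endpoint, which uses HTTP basic auth; API_KEY, API_SECRET, and the image URL are hypothetical placeholders:

import requests

# Sketch only; credentials and URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], tag["confidence"])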

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.1
text 95.9
clothing 91.8
people 80.7
group 65.8
table 60.7
woman 54.6
family 15.7
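
The Microsoft tags come from Azure Computer Vision, whose Analyze endpoint returns name/confidence pairs on a 0-1 scale (shown above as percentages). A sketch; the resource endpoint and subscription key are hypothetical placeholders:

import requests

# Sketch only; YOUR-RESOURCE and YOUR_KEY are placeholders.
resp = requests.post(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))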

Face analysis

AWS Rekognition (8 faces detected; estimated age range, gender with confidence, and emotion scores per face)

Face  Age    Gender         Calm   Happy  Sad    Confused  Surprised  Disgusted  Angry  Fear
1     31-41  Male, 89.8%    99.8%  0.1%   0.1%   0%        0%         0%         0%     0%
2     23-31  Male, 55.4%    0%     99.8%  0%     0%        0.2%       0%         0%     0%
3     28-38  Male, 53.6%    60.6%  13.2%  20%    0.8%      2.1%       0.8%       2.1%   0.4%
4     28-38  Male, 86.8%    99.2%  0%     0.2%   0.1%      0.4%       0.1%       0.1%   0%
5     31-41  Female, 88.3%  100%   0%     0%     0%        0%         0%         0%     0%
6     28-38  Male, 99.7%    99.3%  0.1%   0.2%   0.2%      0.1%       0%         0%     0%
7     28-38  Male, 95.9%    34.5%  30.8%  25.5%  3.3%      1.2%       3.2%       0.7%   0.7%
8     33-41  Male, 94.6%    89.4%  5.4%   2%     1.9%      0.3%       0.5%       0.2%   0.2%
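
Each row above corresponds to one face returned by Rekognition's DetectFaces call with all attributes requested. A minimal boto3 sketch, assuming configured AWS credentials; the file path is a hypothetical placeholder:

import boto3

# Sketch only; Attributes=["ALL"] requests age, gender, and emotion estimates.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}, {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')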

Google Vision (13 faces detected)

For every face, Surprise, Anger, Sorrow, Joy, and Headwear were rated Very unlikely. Blurred was rated Very unlikely for 12 faces and Very likely for 1.
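
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch with the official google-cloud-vision client, assuming GOOGLE_APPLICATION_CREDENTIALS is set; the file path is a hypothetical placeholder:

from google.cloud import vision

# Sketch only; each face_annotation carries one likelihood per attribute.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    print(
        face.surprise_likelihood.name,
        face.anger_likelihood.name,
        face.sorrow_likelihood.name,
        face.joy_likelihood.name,
        face.headwear_likelihood.name,
        face.blurred_likelihood.name,
    )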

Feature analysis

Amazon

Person 99.5%
Chair 95%

Text analysis

Amazon

39465.
KODVK
في
A
Compress A في
Compress
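
The strings above are Rekognition text detections: OCR over the photograph and its margins, which is likely why tokens such as "KODVK" come out garbled. A boto3 sketch, assuming configured AWS credentials and a hypothetical file path:

import boto3

# Sketch only; Rekognition returns LINE and WORD detections with confidences.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"], f'{det["Confidence"]:.1f}%')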

Google

39465.
39465.
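
Google's text detection typically returns the full detected string as the first annotation and then each individual element, which may be why "39465." appears twice above. A sketch with the official client, under the same credential and file-path assumptions as before:

from google.cloud import vision

# Sketch only; text_annotations[0] is the full text, the rest are elements.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)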