Human Generated Data

Title

Untitled (women gathered in living room for Tupperware party)

Date

1954

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8806

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.4
Person 99.3
Person 98.3
Person 97.3
Person 96.1
Person 94.6
Furniture 94.3
Chair 94.3
Person 89.2
Person 88.3
Person 87.8
Room 87.7
Indoors 87.7
Person 87.5
People 87.2
Clothing 83.3
Apparel 83.3
Person 82.2
Person 75
Living Room 66.8
Person 65.1
Person 61.4
Suit 57.9
Overcoat 57.9
Coat 57.9
Photography 57.7
Photo 57.7
Leisure Activities 57
Musician 56.9
Musical Instrument 56.9
Crowd 56.7
Couch 56.3

Clarifai
created on 2023-10-25

people 99.9
group together 99.3
group 98.6
adult 98
woman 97.4
man 96.5
child 94.4
sit 93.9
furniture 93.7
administration 92.3
many 91.3
room 90.9
monochrome 90.5
recreation 90.3
several 90.1
sewing machine 89.8
actor 88.4
chair 87.2
boy 87
employee 85

Imagga
created on 2022-01-09

salon 29.6
shop 28.3
people 20.6
musical instrument 19.3
person 18.4
man 16.8
mercantile establishment 16.6
men 16.3
male 16.3
black 15.6
adult 14.5
city 14.1
percussion instrument 13.2
urban 13.1
drum 12.2
barbershop 12
style 11.9
women 11.8
life 11.6
musician 11.5
place of business 11.1
music 10.9
brass 10.9
lifestyle 10.8
fashion 10.5
indoors 10.5
toyshop 10.5
room 10.5
instrument 10.3
interior 9.7
portrait 9.7
group 9.7
chair 9.2
modern 9.1
business 9.1
working 8.8
wind instrument 8.8
table 8.6
musical 8.6
holiday 8.6
glass 8.5
window 8.5
player 8.5
bass 8.1
classroom 8.1
home 8
art 7.9
guitar 7.8
play 7.7
travel 7.7
motion 7.7
sport 7.6
human 7.5
equipment 7.2
decoration 7.2
clothing 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 96.8
text 96.1
person 94.8
people 75.7
group 57.9
woman 57.2
musical instrument 50.3

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 97.6%
Calm 70.1%
Happy 14%
Sad 6.4%
Surprised 4.6%
Angry 3.5%
Disgusted 0.8%
Confused 0.5%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Female, 71.7%
Calm 97.3%
Happy 0.8%
Surprised 0.8%
Sad 0.7%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 82.8%
Calm 99.9%
Sad 0%
Angry 0%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 27-37
Gender Male, 99.8%
Calm 76.3%
Sad 22.3%
Confused 0.6%
Surprised 0.3%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 38-46
Gender Male, 67.5%
Calm 98.4%
Happy 0.8%
Surprised 0.4%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 93.9%
Calm 98%
Surprised 0.7%
Sad 0.4%
Disgusted 0.3%
Confused 0.3%
Fear 0.2%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 99.7%
Happy 0.1%
Confused 0.1%
Sad 0.1%
Disgusted 0%
Surprised 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 43-51
Gender Female, 50.8%
Sad 94.9%
Happy 2.5%
Calm 0.9%
Confused 0.6%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 24-34
Gender Male, 90.8%
Calm 99.6%
Happy 0.1%
Sad 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 94.3%

Text analysis

Amazon

39464.
for
Appeal for
Appeal

Google

39464. |MJIヨ--YTヨ3A°2--AgO
39464.
MJI
--YT
3A
°
2
--
AgO
|