Human Generated Data

Title

Untitled (women seated in living room at Tupperware party)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7199

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99
Human 99
Person 99
Person 98.5
Person 97.7
Person 92.9
Indoors 90.9
Room 90.9
Person 90
Person 84.3
Person 79.3
Person 78.7
Person 75.3
Interior Design 74.6
People 72.1
Crowd 71.5
Classroom 68.4
School 68.4
Jury 64.5
Clinic 61.5
Court 58.9
Audience 55.1
Person 48.3

Clarifai
created on 2023-10-25

people 99.5
education 99
teacher 97.3
monochrome 97.1
school 96.4
classroom 96.4
child 96.1
group 95.8
adult 94.5
woman 94.5
man 93.2
elementary school 92.1
indoors 92
sit 91.1
chair 86.9
many 81.6
horizontal 80.2
room 79.3
communication 77.6
furniture 77.1

Imagga
created on 2022-01-08

barbershop 81.4
shop 68.9
mercantile establishment 53.2
place of business 35.4
people 25.1
man 20.9
establishment 17.7
person 17.4
business 14.6
team 14.3
window 14.2
male 14.2
silhouette 14.1
group 13.7
film 12.6
crowd 12.5
design 12.4
adult 12
black 11.4
home 11.2
men 11.2
negative 10.8
salon 10.7
family 10.7
grunge 10.2
light 10
city 10
music 9.9
art 9.6
couple 9.6
women 9.5
house 9.2
room 9.2
modern 9.1
building 9
portrait 9
urban 8.7
glass 8.6
architecture 8.6
old 8.4
teamwork 8.3
vintage 8.3
occupation 8.2
retro 8.2
businesswoman 8.2
happy 8.1
interior 8
working 7.9
businessman 7.9
work 7.8
musical 7.7
professional 7.6
fashion 7.5
human 7.5
mother 7.4
equipment 7.4
indoor 7.3
decoration 7.2
smiling 7.2
worker 7.2
science 7.1
gymnasium 7.1
indoors 7
world 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.4
person 91.8
indoor 91.1
clothing 88.4
drawing 77
cartoon 70.8
group 59.6

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 94.6%
Calm 60.4%
Sad 28.2%
Happy 4.3%
Confused 4.2%
Angry 0.9%
Disgusted 0.8%
Surprised 0.7%
Fear 0.5%

AWS Rekognition

Age 37-45
Gender Female, 98.3%
Calm 99.3%
Sad 0.3%
Happy 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 51.4%
Sad 85.6%
Happy 9.9%
Calm 3.8%
Confused 0.2%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Female, 98.4%
Calm 94.7%
Sad 2.5%
Happy 2.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 48-54
Gender Female, 87.4%
Calm 93%
Sad 4.3%
Happy 1.1%
Confused 0.7%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Female, 96.6%
Calm 97.4%
Sad 1.6%
Confused 0.4%
Angry 0.3%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Female, 56.4%
Calm 74.9%
Sad 19%
Happy 2.3%
Confused 1.6%
Angry 0.6%
Disgusted 0.6%
Fear 0.5%
Surprised 0.4%

AWS Rekognition

Age 28-38
Gender Male, 83.4%
Calm 67.6%
Confused 11.1%
Sad 8.6%
Happy 6.8%
Angry 2.6%
Surprised 1.7%
Disgusted 0.9%
Fear 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Imagga

paintings art 77.4%
text visuals 21.1%
interior objects 1.4%

Text analysis

Amazon

43946
TUPPERWARE
-
el - - - -
- Valles
el
Valles