Human Generated Data

Title

Untitled (Tupperware party in livingroom)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8022

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 98.6
Person 98.6
Person 98.3
Person 97.8
Person 97.1
Person 96.7
Person 93.2
Person 92.7
Person 91
Classroom 90.5
Room 90.5
School 90.5
Indoors 90.5
Person 82.5
Clothing 78.2
Apparel 78.2
People 76.2
Restaurant 72.1
Cafeteria 69.5
Crowd 68.9
Person 65
Workshop 61.2

Clarifai
created on 2023-10-26

people 99.8
many 98.3
group 97.5
woman 97.4
man 97.2
adult 96
monochrome 95.1
group together 94.6
sit 93.1
child 91.2
crowd 90.7
sitting 87.9
party 84
wear 82.3
indoors 79.7
celebration 79.5
chair 79.5
enjoyment 79.2
meeting 78.4
nostalgia 78.2

Imagga
created on 2022-01-15

glass 20.1
beaker 19
party 14.6
celebration 14.3
table 14.3
jar 14
container 13.5
people 13.4
wind instrument 12.9
wedding 12.9
brass 12.8
man 12.8
person 12.7
sax 12.3
shop 12
drink 11.7
setting 11.6
science 11.6
business 11.5
vessel 11.5
restaurant 11.3
human 11.2
event 11.1
decoration 10.8
wine 10.7
chemistry 10.6
flowers 10.4
technology 10.4
banquet 10.2
musical instrument 9.7
dinner 9.7
lab 9.7
medical 9.7
laboratory 9.6
research 9.5
luxury 9.4
glasses 9.2
adult 9
champagne 8.9
reception 8.8
indoors 8.8
education 8.7
dining 8.6
biology 8.5
work 8.5
male 8.5
instrument 8.3
service 8.3
city 8.3
gold 8.2
alcohol 7.9
catering 7.8
worker 7.8
napkin 7.8
chemical 7.7
elegant 7.7
knife 7.7
fork 7.7
formal 7.6
mercantile establishment 7.6
equipment 7.5
cornet 7.4
student 7.2
black 7.2
holiday 7.2
working 7.1
medicine 7
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.3
person 87.2
clothing 86
man 64.3
black and white 52.7
clothes 24.1

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 91.1%
Surprised 58.8%
Calm 33.4%
Happy 4.2%
Sad 1.1%
Angry 0.8%
Disgusted 0.7%
Fear 0.6%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Male, 87.6%
Calm 81.3%
Happy 14.1%
Sad 3.3%
Confused 0.7%
Surprised 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 63.3%
Calm 80.4%
Confused 9.6%
Surprised 4.8%
Sad 2.7%
Happy 1.3%
Disgusted 0.5%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 23-33
Gender Female, 93%
Calm 62.8%
Sad 33.2%
Confused 1.5%
Surprised 1.4%
Angry 0.4%
Fear 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 35-43
Gender Female, 79.3%
Calm 99.1%
Sad 0.4%
Happy 0.2%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 34-42
Gender Female, 73.8%
Calm 97.6%
Surprised 0.7%
Happy 0.6%
Confused 0.5%
Sad 0.4%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 57.7%
Calm 80.1%
Confused 9.5%
Sad 7.7%
Happy 1.1%
Disgusted 0.6%
Surprised 0.5%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 49-57
Gender Female, 85.7%
Calm 96.9%
Surprised 1%
Sad 0.6%
Confused 0.6%
Angry 0.3%
Disgusted 0.3%
Happy 0.2%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 73.4%
Happy 72.3%
Calm 13.6%
Surprised 9.5%
Confused 1.3%
Sad 1.3%
Disgusted 0.9%
Fear 0.6%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 97.3%
text visuals 1.7%

Captions

Microsoft
created on 2022-01-15

text 93.2%

Text analysis

Amazon

FLORIDA
SARASOTA.
STEINMETZ. SARASOTA. FLORIDA
STEINMETZ.
43190

Google

43190. STEINMETZ. SARASOTA. FLORIDA
43190.
STEINMETZ.
SARASOTA.
FLORIDA