Human Generated Data

Title

Untitled (couples talking and dancing at party)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7709

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.7
Person 99.7
Person 99.4
Person 99.4
Person 99
Person 98.6
Person 97.9
Person 97.6
Clothing 95.6
Apparel 95.6
Indoors 91.1
Person 90.7
Person 89.4
Room 87.6
Crowd 83.9
Overcoat 83.3
Suit 83.3
Coat 83.3
Person 79
Audience 77.1
People 75.2
Person 73.5
Furniture 70.6
Text 67.3
Photography 62.8
Photo 62.8
Classroom 60.2
School 60.2
Fashion 59.2
Gown 59.2
Tuxedo 59
Female 58.1
Robe 58.1
Face 57.2
Sitting 55.6

Clarifai
created on 2023-10-26

people 99.8
group 99.6
woman 98.1
many 97.6
man 97.4
adult 97
teacher 95.7
music 95.6
group together 95.3
monochrome 94.7
education 94
child 93.6
audience 93.2
classroom 91.9
musician 91.7
elementary school 90.9
school 87.4
leader 84.3
furniture 83.7
administration 83.6

Imagga
created on 2022-01-09

teacher 41.3
person 39.6
man 38.3
businessman 35.3
people 33.5
male 33.3
adult 32.3
business 31.6
office 28
student 25.3
classroom 24.4
group 24.2
professional 23.8
education 23.4
room 23.1
table 22.5
blackboard 20.1
work 19.6
class 19.3
meeting 17.9
computer 17.8
sitting 17.2
desk 17
teaching 16.6
men 16.3
happy 16.3
hand 15.9
educator 15.8
school 15.8
modern 15.4
executive 15.4
board 15.4
smiling 15.2
communication 15.1
job 15
corporate 14.6
laptop 14.6
businesspeople 14.2
team 13.4
working 13.3
indoors 13.2
manager 13
businesswoman 12.7
women 12.7
worker 12.6
chair 12.5
studying 12.5
entrepreneur 12.1
looking 12
occupation 11.9
technology 11.9
smile 11.4
hands 11.3
study 11.2
teamwork 11.1
indoor 11
stage 10.9
cheerful 10.6
success 10.5
seminar 9.8
human 9.7
colleagues 9.7
portrait 9.7
musical instrument 9.5
college 9.5
happiness 9.4
two 9.3
mature 9.3
successful 9.2
confident 9.1
handsome 8.9
employee 8.9
math 8.8
mathematics 8.8
together 8.8
couple 8.7
boy 8.7
diversity 8.6
adults 8.5
hall 8.3
back 8.3
holding 8.3
lecture 7.9
life 7.8
enrollee 7.8
standing 7.8
conference 7.8
students 7.8
40s 7.8
discussion 7.8
mid adult 7.7
building 7.7
casual 7.6
workplace 7.6
finance 7.6
talking 7.6
learning 7.5
equipment 7.5
screen 7.5
presentation 7.4
wind instrument 7.3
lifestyle 7.2
black 7.2
day 7.1
restaurant 7
glass 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.6
text 95.9
clothing 85.2
woman 76.9
wedding dress 64.2
man 63.4
people 63
dress 58.3

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.3%
Happy 54.5%
Sad 27.7%
Calm 10.1%
Confused 4.6%
Disgusted 1.1%
Surprised 0.8%
Fear 0.7%
Angry 0.5%

AWS Rekognition

Age 37-45
Gender Female, 67.8%
Calm 85.8%
Happy 3.6%
Surprised 3.6%
Sad 2.6%
Confused 1.7%
Angry 1.6%
Disgusted 0.8%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Male, 98%
Calm 96.7%
Sad 1.8%
Happy 0.5%
Confused 0.4%
Disgusted 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 95.8%
Sad 95.7%
Calm 1.8%
Confused 1.3%
Happy 0.5%
Surprised 0.2%
Fear 0.2%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 71.9%
Sad 23.7%
Happy 1.6%
Surprised 1%
Confused 0.8%
Disgusted 0.6%
Angry 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

17178
DE
DE TALE
NAMTRAT
NAGOY NAMTRAT
NAGOY
19/78
TALE
WTX

Google

17/78.
17/78.