Human Generated Data

Title

Untitled (nine employees posing inside produce department)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13508

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 98.8
Human 98.8
Person 98.7
Person 97.2
Person 95.4
Person 95
Person 90.2
Person 89
Person 86.2
Apparel 84.8
Clothing 84.8
Person 70.9
People 66.8
Crowd 64.8
Face 63.8
Photography 61.3
Photo 61.3
Shoe 59.4
Footwear 59.4
Outdoors 58.3
Sitting 57.4
Female 56.2
Nature 56

Imagga
created on 2022-02-04

blackboard 62.7
teacher 43.6
classroom 38.5
person 35.9
room 35.5
man 34.9
people 31.2
male 30.5
business 28.5
adult 26.2
laptop 25.9
group 25
professional 24.3
educator 21.8
education 21.7
businessman 21.2
office 20.2
computer 20.1
student 19.9
work 19.6
modern 18.2
women 18.2
men 18
job 17.7
happy 17.5
table 17.3
school 17.2
corporate 16.3
indoors 15.8
team 15.2
smiling 15.2
musical instrument 14.7
indoor 14.6
board 14.5
smile 14.3
chair 14.2
desk 14.2
meeting 14.1
sitting 13.7
class 13.5
lifestyle 13
cheerful 13
percussion instrument 12.8
businesswoman 12.7
technology 12.6
communication 11.8
students 11.7
interior 11.5
executive 11.5
couple 11.3
confident 10.9
holding 10.7
stage 10.7
hand 10.6
looking 10.4
black 10.2
happiness 10.2
house 10
steel drum 9.7
teaching 9.7
portrait 9.7
employee 9.7
success 9.7
musician 9.6
home 9.6
boy 9.6
study 9.3
casual 9.3
manager 9.3
teamwork 9.3
worker 9
handsome 8.9
to 8.9
life 8.8
math 8.8
businesspeople 8.5
college 8.5
mature 8.4
occupation 8.2
lady 8.1
suit 8.1
working 8
day 7.8
mathematics 7.8
conference 7.8
full length 7.8
leader 7.7
notebook 7.7
studying 7.7
two 7.6
career 7.6
silhouette 7.5
phone 7.4
design 7.3
music 7.2
color 7.2
speaker 7.2
platform 7.1
together 7

Google
created on 2022-02-04

Black 89.7
Black-and-white 85.9
Style 84
Adaptation 79.3
Chair 76.7
Monochrome photography 76.5
Art 76.4
Couch 76.1
Snapshot 74.3
Font 73.9
Event 73.6
Monochrome 73.2
Suit 66.6
Visual arts 64.5
Stock photography 63.5
Room 61.3
Music 60.6
Illustration 59.7
Display device 59.2
Sitting 57.5

Microsoft
created on 2022-02-04

text 99.2
person 93.1
clothing 87.5
cartoon 75.3
drawing 71.1
man 60.4

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 98.2%
Calm 51.2%
Happy 32%
Surprised 8.1%
Sad 3.6%
Confused 1.6%
Angry 1.5%
Disgusted 1.3%
Fear 0.8%

AWS Rekognition

Age 43-51
Gender Male, 65.3%
Happy 95.1%
Calm 2.3%
Surprised 1.6%
Sad 0.3%
Disgusted 0.2%
Fear 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Sad 49.3%
Calm 33.2%
Surprised 5.5%
Confused 4.4%
Disgusted 2.5%
Happy 2.2%
Fear 2%
Angry 0.9%

AWS Rekognition

Age 27-37
Gender Female, 50.9%
Sad 82.8%
Happy 8.9%
Calm 4.7%
Angry 1.4%
Confused 0.8%
Disgusted 0.8%
Surprised 0.3%
Fear 0.3%

AWS Rekognition

Age 35-43
Gender Male, 97.8%
Calm 63.1%
Surprised 26.3%
Happy 7%
Confused 1.3%
Sad 0.8%
Angry 0.7%
Disgusted 0.6%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 86.5%
Happy 86.7%
Sad 6.7%
Calm 1.6%
Confused 1.5%
Angry 1.4%
Fear 1.1%
Disgusted 0.6%
Surprised 0.5%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 45.4%
Surprised 24.4%
Happy 8.6%
Sad 7.5%
Confused 5.3%
Fear 3.5%
Angry 2.9%
Disgusted 2.3%

AWS Rekognition

Age 26-36
Gender Female, 99.1%
Calm 39.3%
Happy 29.7%
Fear 9.4%
Sad 9.2%
Confused 5.2%
Angry 2.8%
Disgusted 2.7%
Surprised 1.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Shoe 59.4%

Captions

Microsoft

a group of people sitting at a desk 71.4%
a group of people in a room 71.3%
a group of people standing in front of a laptop 47.9%

Text analysis

Amazon

DAY
EVERY
FRESHNESS
R
rapes
neur
de
neur JORY
de SHEMIO
POTITIDES
JORY
SHEMIO
ЕТ

Google

-
MJI7--YT3RA°2
FRESHNESS MJI7--YT3RA°2 - - XAGOX
FRESHNESS
XAGOX