Human Generated Data

Title

Untitled (man and woman with children sitting in living room)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14500

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 98.2
Person 98.2
Clinic 98.1
Person 94.6
Room 92.6
Indoors 92.6
Person 87.1
Person 84.4
Person 82.1
Hospital 80.2
Shoe 75.1
Footwear 75.1
Apparel 75.1
Clothing 75.1
Furniture 73.4
Person 73.2
Doctor 72.2
Operating Theatre 69.8
People 67.9
Baby 60.4
Newborn 58.3
Person 57.7
Nurse 57.1
Person 46.3
Person 44.2

Imagga
created on 2022-01-29

room 71.9
classroom 61.6
table 41.6
interior 31.9
chair 28.2
meeting 26.4
office 26.1
person 25.8
people 25.7
man 24.9
male 24.1
indoors 23.7
home 23.1
business 23.1
businessman 22.1
sitting 20.6
modern 20.3
group 20.2
restaurant 19.5
men 18.9
work 18.8
teacher 18.6
furniture 18.5
indoor 18.3
desk 17.9
house 17.6
women 17.4
smiling 17.4
musical instrument 17.3
executive 17.2
team 17
communication 16
together 15.8
laptop 15.7
presentation 14.9
professional 14.9
brass 14.7
conference 14.7
happy 14.4
suit 14.4
talking 14.3
design 13.5
adult 13.4
corporate 12.9
businesswoman 12.7
wind instrument 12.6
floor 12.1
computer 12.1
teamwork 12.1
inside 12
lifestyle 11.6
decor 11.5
working 11.5
dining 11.4
hall 11.4
education 11.3
manager 11.2
student 11.1
window 11
job 10.6
style 10.4
worker 10.4
happiness 10.2
glass 10.1
kitchen 10.1
smile 10
blackboard 9.8
colleagues 9.7
couch 9.7
workplace 9.5
contemporary 9.4
coffee 9.3
board 9.2
wine 9.2
drink 9.2
wood 9.2
food 9.1
confident 9.1
portrait 9.1
cheerful 8.9
chairs 8.8
couple 8.7
businesspeople 8.5
learning 8.5
holding 8.3
director 8.1
looking 8
architecture 7.8
percussion instrument 7.8
teaching 7.8
two people 7.8
dinner 7.8
class 7.7
leader 7.7
diversity 7.7
residential 7.7
drinking 7.7
sofa 7.7
boss 7.7
life 7.6
living 7.6
eating 7.6
plan 7.6
togetherness 7.6
eat 7.6
marimba 7.5
study 7.5
mature 7.4
service 7.4
educator 7.4
new 7.3
speaker 7.3
success 7.2
domestic 7.2
handsome 7.1
love 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 98.8
table 89.7
furniture 86.2
black and white 78.4
clothing 75.6
person 70.7

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 56.4%
Happy 36.3%
Sad 30.9%
Calm 28.3%
Confused 2%
Disgusted 0.9%
Angry 0.7%
Surprised 0.5%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Female, 99.6%
Calm 88.7%
Sad 7.8%
Surprised 2.3%
Fear 0.6%
Disgusted 0.2%
Angry 0.2%
Happy 0.2%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Female, 90.1%
Calm 55%
Sad 43.5%
Angry 0.4%
Disgusted 0.4%
Confused 0.3%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 38-46
Gender Female, 57.3%
Calm 63.4%
Sad 35.8%
Confused 0.4%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 50-58
Gender Male, 95.2%
Happy 58.3%
Calm 19.9%
Sad 16.9%
Confused 1.9%
Fear 0.9%
Surprised 0.9%
Angry 0.8%
Disgusted 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%
Shoe 75.1%

Captions

Microsoft

a group of people in a room 87.3%
a group of people sitting at a table in front of a mirror 56.5%
a group of people sitting at a table 56.4%

Text analysis

Amazon

4
٤١٢w
YUSCO ٤١٢w
YUSCO

Google

MJIR
YT3RA
MJIR YT3RA