Human Generated Data

Title

Untitled (two women and man on couch)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20283

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.1
Human 99.1
Apparel 98.7
Clothing 98.7
Person 98.5
Person 94.1
Chair 91
Furniture 91
Female 78.2
Dress 72.9
People 71.3
Girl 61.9
Leisure Activities 61.9
Photography 61.5
Photo 61.5
Face 60.3
Woman 58.7
Footwear 57.4
Shoe 57.4

Imagga
created on 2022-03-05

patient 57.6
person 56.3
man 40.3
case 37.5
male 36.9
sick person 36.2
people 35.1
nurse 30.4
men 25.8
businessman 23.8
senior 23.4
sitting 23.2
happy 23.2
professional 22.5
adult 22.5
indoors 22
business 21.9
home 21.5
smiling 21
office 19.3
medical 18.5
working 17.7
talking 17.1
casual 16.9
worker 16.8
room 16.1
mature 15.8
health 15.3
women 15
smile 15
couple 14.8
team 14.3
hospital 14.3
work 14.1
cheerful 13.8
portrait 13.6
together 13.1
life 12.7
colleagues 12.6
day 12.6
handsome 12.5
elderly 12.4
job 12.4
group 12.1
teamwork 12.1
computer 12
corporate 12
suit 11.7
illness 11.4
desk 11.3
doctor 11.3
inside 11
laptop 10.9
businesswoman 10.9
40s 10.7
mid adult 10.6
30s 10.6
profession 10.5
looking 10.4
happiness 10.2
lifestyle 10.1
coat 10.1
occupation 10.1
color 10
holding 9.9
care 9.9
clinic 9.8
human 9.7
two people 9.7
grandfather 9.6
retirement 9.6
table 9.5
businesspeople 9.5
old 9.1
coworkers 8.8
medicine 8.8
casual clothing 8.8
older 8.7
retired 8.7
face 8.5
meeting 8.5
two 8.5
manager 8.4
camera 8.3
indoor 8.2
associates 7.9
70s 7.9
50s 7.8
60s 7.8
middle aged 7.8
sick 7.7
laboratory 7.7
modern 7.7
enjoying 7.6
mother 7.5
horizontal 7.5
relaxed 7.5
staff 7.2
love 7.1
to 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97
person 96.2
outdoor 92.6
clothing 83.2
posing 77.8
smile 72.6
black and white 71.8
footwear 69.3
dance 58.3
old 42.7

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Male, 58%
Calm 62%
Happy 22.3%
Sad 7.9%
Surprised 2.9%
Confused 1.8%
Disgusted 1.3%
Fear 1.1%
Angry 0.8%

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Sad 43%
Happy 39.7%
Angry 4.9%
Calm 4.3%
Confused 3.3%
Surprised 2.7%
Disgusted 1.2%
Fear 0.9%

AWS Rekognition

Age 33-41
Gender Female, 95.8%
Surprised 93.4%
Angry 1.7%
Fear 1.5%
Happy 1.1%
Calm 1%
Disgusted 0.7%
Confused 0.3%
Sad 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Shoe 57.4%

Captions

Microsoft

a group of people posing for a photo 91.5%
a group of people posing for the camera 91.4%
a man and woman posing for a photo 74.4%

Text analysis

Amazon

Ja