Human Generated Data

Title

Untitled (two men and one woman on couch)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20282

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.3
Human 98.3
Person 96.6
Furniture 95.4
Chair 93.9
Person 93.2
Sitting 89.4
Clothing 84.9
Apparel 84.9
Person 63
Portrait 61.6
Face 61.6
Photography 61.6
Photo 61.6
Table 58.4
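
The Amazon values above are label-detection confidence scores on a 0-100 scale. A minimal sketch of how such labels are typically obtained from Amazon Rekognition's DetectLabels API follows; the file name and thresholds are illustrative assumptions, not part of this record.

```python
# Sketch: object/scene labels with Amazon Rekognition's DetectLabels API.
# The image path and thresholds are placeholders, not part of this record.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("untitled_couch.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # the tag list above appears to cut off around 50-60
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```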

Clarifai
created on 2023-10-22

people 99.8
adult 98.3
woman 97.6
group 97.1
medical practitioner 96.5
man 96.4
furniture 94.9
sit 94.4
three 93.6
group together 93.4
chair 93.1
two 93.1
four 90.8
administration 89.9
leader 87.4
elderly 86.3
room 86.2
facial expression 81.1
seat 80.7
five 79.9
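
The Clarifai values are concept predictions from its general image-recognition model, scaled here to 0-100. A hedged sketch of the kind of request that produces them, using Clarifai's public v2 REST endpoint; the key, image URL, and model ID are placeholders, and the payload shape is an assumption based on Clarifai's documented API.

```python
# Sketch: requesting concepts from Clarifai's general model over its v2 REST API.
# Endpoint path, payload shape, and model ID are assumptions; key and URL are
# placeholders.
import requests

PAT = "YOUR_CLARIFAI_KEY"               # placeholder personal access token
MODEL_ID = "general-image-recognition"  # assumed ID of the general concept model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))  # value is 0-1
```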

Imagga
created on 2022-03-05

patient 100
person 94.9
sick person 92.8
case 91.9
man 46.4
male 37.6
senior 36.6
indoors 33.4
nurse 33.1
people 32.9
sitting 27.5
home 27.2
smiling 25.3
adult 24.9
happy 23.8
couple 22.7
elderly 22
men 21.5
room 21
together 20.2
medical 19.4
businessman 19.4
office 19.3
hospital 18.8
mature 18.6
working 18.6
business 18.2
retired 17.5
30s 17.3
talking 17.1
colleagues 16.5
inside 15.7
40s 15.6
two people 15.6
retirement 15.4
table 14.7
mid adult 14.5
casual 14.4
grandfather 14.4
meeting 14.1
health 13.9
occupation 13.8
lifestyle 13.7
professional 13.7
portrait 13.6
women 13.5
camera 12.9
computer 12.8
70s 12.8
middle aged 12.7
day 12.6
illness 12.4
businesspeople 12.3
doctor 12.2
cheerful 12.2
group 12.1
teamwork 12.1
old 11.9
businesswoman 11.8
50s 11.8
casual clothing 11.7
discussion 11.7
thirties 11.7
worker 11.6
desk 11.3
happiness 11
laptop 10.9
team 10.8
care 10.7
smile 10.7
color 10.6
four 10.6
education 10.4
work 10.2
lab coat 10.1
horizontal 10.1
holding 9.9
coat 9.8
older 9.7
looking 9.6
specialist 9.6
standing 9.6
husband 9.6
teacher 9.1
indoor 9.1
classroom 9
fifties 8.9
to 8.9
coworkers 8.8
caring 8.8
60s 8.8
half length 8.8
sick 8.7
daytime 8.7
bed 8.6
relaxed 8.5
clinic 8.4
job 8
four people 7.9
seventies 7.9
discussing 7.9
helping 7.9
forties 7.9
class 7.7
angle 7.7
looking camera 7.7
two 7.6
wife 7.6
communication 7.6
20s 7.3
student 7.2
family 7.1
face 7.1
interior 7.1
medicine 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.4
person 88
clothing 85.3
black and white 76.2
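
The Microsoft values resemble the tag output of the Azure Computer Vision service, which reports confidences on a 0-1 scale (shown here as percentages). A sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and file name:

```python
# Sketch: image tagging with Azure Computer Vision. Endpoint, key, and file
# name are placeholders; the SDK reports confidence in the 0-1 range.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

with open("untitled_couch.jpg", "rb") as f:  # hypothetical local copy of the image
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```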

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Male, 98.4%
Calm 98.9%
Disgusted 0.3%
Happy 0.2%
Sad 0.2%
Surprised 0.1%
Angry 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 99.4%
Happy 82.7%
Confused 6%
Surprised 3%
Angry 2.6%
Calm 2.1%
Disgusted 1.7%
Sad 1.6%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 61%
Happy 19.6%
Confused 12%
Surprised 2.6%
Disgusted 1.8%
Sad 1.2%
Angry 1%
Fear 0.9%
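
The three AWS Rekognition blocks above are per-face results: an estimated age range, a gender prediction with confidence, and a set of emotion scores. A sketch of the DetectFaces call that yields this structure, assuming a local copy of the photograph:

```python
# Sketch: facial attributes (age range, gender, emotions) from Rekognition's
# DetectFaces API with all attributes enabled. The file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("untitled_couch.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```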

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
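
The Google Vision rows are face-annotation likelihood fields (surprise, anger, sorrow, joy, headwear, blur), reported as enum values such as VERY_UNLIKELY rather than numeric scores. A sketch with the google-cloud-vision client, file path assumed:

```python
# Sketch: per-face likelihood fields from Google Cloud Vision face detection.
# The file path is a placeholder; credentials are assumed to be configured via
# GOOGLE_APPLICATION_CREDENTIALS.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_couch.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```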

Feature analysis

Amazon

Person
Chair
Person 98.3%
Person 96.6%
Person 93.2%
Person 63%
Chair 93.9%

Categories

Imagga

people portraits 78.1%
paintings art 21.3%

Captions

Text analysis

Amazon

THE
oe

Google

347 LZ THE YT33A°2-XAO
347
LZ
THE
YT33A°2-XAO
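
The Amazon and Google entries above are OCR results: Rekognition's DetectText returns individual detections with confidences, while Google Vision's text detection returns the full recognized string first and then word-level tokens, which may explain the layout of the Google block. A sketch of both calls, assuming a local copy of the image:

```python
# Sketch: text (OCR) detection with Amazon Rekognition and Google Cloud Vision.
# The file name is a placeholder for a local copy of this photograph.
import boto3
from google.cloud import vision

with open("untitled_couch.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition text detection (line-level results)
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))

# Google Cloud Vision text detection (full string, then individual tokens)
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```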