Human Generated Data

Title

Untitled (several adults sitting in living room watching television)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14550

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 99
Person 98.7
Person 98.7
Clinic 98.4
Person 97.6
Person 97.4
Hospital 95
Operating Theatre 94.8
Person 86.9
Room 76.4
Indoors 76.4
Furniture 61.9
Bed 55.9
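
For context, the label/confidence pairs above follow the output shape of Amazon Rekognition's DetectLabels API. The sketch below is illustrative only, not the pipeline used to generate this record; the image filename and the MaxLabels/MinConfidence thresholds are assumptions.

import boto3

# Illustrative sketch: produce label/confidence pairs like those listed above
# with Amazon Rekognition's DetectLabels API via boto3.
# The filename and threshold values are assumptions.
rekognition = boto3.client("rekognition")

with open("jack_gould_untitled_1948.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=55.0,
    )

for label in response["Labels"]:
    # Prints lines such as "Person 99.4"
    print(f"{label['Name']} {label['Confidence']:.1f}")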

Clarifai
created on 2023-10-29

people 99.9
group 97.9
child 97.3
man 96.4
adult 96.3
woman 96.2
group together 94.7
sit 92.8
indoors 91
chair 90
administration 87.3
room 85.9
family 84.3
wear 83.3
four 80.9
leader 80.7
education 80.4
sitting 80.1
veil 76.8
actor 74.6

Imagga
created on 2022-01-29

barbershop 69.5
shop 53.7
mercantile establishment 41.8
home 39.1
hospital 37.9
patient 36.2
man 35
person 33.1
people 32.9
chair 29.2
salon 28.7
place of business 27.9
room 27.7
indoors 27.2
adult 26.1
male 25.5
happy 25.1
nurse 22.3
medical 21.2
senior 20.6
family 20.5
couple 20
interior 17.7
couch 17.4
smiling 17.4
sitting 17.2
barber chair 16.8
health 16.7
happiness 15.7
men 15.5
doctor 15
portrait 14.9
care 14.8
cheerful 14.6
indoor 14.6
hairdresser 14.3
smile 14.3
medicine 14.1
professional 14
establishment 13.9
lifestyle 13
women 12.7
two people 12.6
work 12.6
house 12.5
child 12.4
bed 12.3
seat 12
inside 12
20s 11.9
sick person 11.5
clinic 11.4
case 11.3
love 11
mother 10.9
retired 10.7
kid 10.6
lady 10.6
elderly 10.5
furniture 10.5
loving 10.5
illness 10.5
uniform 10.5
mature 10.2
occupation 10.1
horizontal 10.1
face 9.9
father 9.9
old 9.8
check 9.7
resting 9.5
talking 9.5
adults 9.5
camera 9.2
worker 8.9
70s 8.9
office 8.8
together 8.8
break 8.6
husband 8.6
treatment 8.3
dress 8.1
life 8
to 8
facing camera 7.9
living room 7.8
cups 7.8
daughter 7.7
sick 7.7
retirement 7.7
casual 7.6
two 7.6
living 7.6
enjoying 7.6
joy 7.5
relationship 7.5
emotion 7.4
joyful 7.4
aged 7.2
looking 7.2
working 7.1
modern 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

indoor 91.9
house 90.4
person 88.3
clothing 86.3
furniture 83.4
text 82.8
man 81.3
table 76
human face 54.1
old 40.7
clothes 17.7

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 79%
Calm 86.9%
Sad 10.6%
Confused 0.8%
Angry 0.6%
Surprised 0.4%
Fear 0.3%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 42-50
Gender Male, 99.5%
Calm 99.8%
Surprised 0%
Sad 0%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 99.1%
Calm 99.6%
Sad 0.2%
Happy 0.1%
Disgusted 0%
Confused 0%
Fear 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 21-29
Gender Female, 99.4%
Disgusted 36.7%
Calm 26.5%
Confused 12.3%
Sad 10.6%
Angry 6.6%
Happy 2.8%
Fear 2.3%
Surprised 2.2%

AWS Rekognition

Age 29-39
Gender Male, 90.6%
Calm 71.4%
Sad 20.7%
Happy 2.7%
Confused 2.4%
Disgusted 1.1%
Angry 0.7%
Fear 0.7%
Surprised 0.4%
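
The age range, gender, and emotion percentages above match the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. The sketch below is illustrative only; the image filename is an assumption.

import boto3

# Illustrative sketch: per-face age range, gender, and emotion scores like
# those above, from Amazon Rekognition's DetectFaces API with Attributes=["ALL"].
rekognition = boto3.client("rekognition")

with open("jack_gould_untitled_1948.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types are returned uppercase (e.g. "CALM"); reformat to match the listing.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")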

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
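
The "Very unlikely" buckets above correspond to the likelihood enums returned by Google Cloud Vision face detection. The sketch below is illustrative only; the image filename is an assumption.

from google.cloud import vision

# Illustrative sketch: per-face likelihood buckets like those above, from
# Google Cloud Vision's face detection method.
client = vision.ImageAnnotatorClient()

with open("jack_gould_untitled_1948.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def bucket(likelihood):
    # Convert the enum name (e.g. VERY_UNLIKELY) to the page's "Very unlikely" form.
    return likelihood.name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))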

Feature analysis

Amazon

Person
Person 99.4%
Person 99%
Person 98.7%
Person 98.7%
Person 97.6%
Person 97.4%
Person 86.9%

Categories

Imagga

paintings art 98.8%

Text analysis

Google

MJI7 -- YT37A°2 - - XAGO
MJI7
--
YT37A°2
-
XAGO
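
The detected strings above (a full line followed by its individual tokens) match the shape of Google Cloud Vision text detection output. The sketch below is illustrative only; the image filename is an assumption.

from google.cloud import vision

# Illustrative sketch: OCR output like the strings above, from Google Cloud
# Vision's text detection method.
client = vision.ImageAnnotatorClient()

with open("jack_gould_untitled_1948.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

annotations = response.text_annotations
if annotations:
    # The first annotation is the full detected string; the rest are individual tokens.
    print(annotations[0].description)
    for token in annotations[1:]:
        print(token.description)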