Human Generated Data

Title

Untitled (seven adults sitting in living room watching television)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14269

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.7
Person 99.7
Person 99.4
Person 99.2
Person 98
Indoors 96.9
Room 96.9
Person 95.7
Clinic 93.8
Person 93.4
Person 92.3
Display 90.2
Screen 90.2
Electronics 90.2
Monitor 90.2
Living Room 80.2
Interior Design 79.2
Furniture 79.1
Person 75.6
People 63.4
LCD Screen 62.7
Couch 59.1
Hospital 59
Operating Theatre 59
TV 56.5
Television 56.5
Baby 56.4
Bedroom 55.3
Person 51.7

Imagga
created on 2022-01-29

hospital 69.2
barbershop 62.2
patient 51.9
shop 49.1
person 39
man 38.3
mercantile establishment 37.4
medical 33.6
people 32.9
room 32.4
indoors 30.8
male 28.4
home 27.1
adult 26.7
doctor 25.4
place of business 24.9
medicine 23.8
nurse 23.5
health 22.9
professional 19.8
surgeon 19.1
clinic 18.7
case 17.9
interior 17.7
care 17.3
work 17.3
sick person 17.3
office 17
happy 16.9
illness 16.2
senior 15.9
men 15.5
smiling 15.2
indoor 14.6
bed 14.2
couple 13.9
occupation 13.8
chair 13.6
horizontal 13.4
working 13.3
lifestyle 13
cheerful 13
treatment 12.9
sick 12.6
establishment 12.4
pain 12.4
specialist 12.4
uniform 12.4
sitting 12
inside 12
family 11.6
exam 11.5
talking 11.4
women 11.1
happiness 11
surgery 10.8
couch 10.6
job 10.6
child 10.4
house 10
smile 10
hairdresser 10
equipment 9.8
two people 9.7
portrait 9.1
to 8.9
computer 8.8
check 8.7
color 8.3
holding 8.3
team 8.1
worker 8
ward 7.9
assistant 7.8
disease 7.8
30s 7.7
profession 7.7
husband 7.6
two 7.6
resting 7.6
adults 7.6
mother 7.4
phone 7.4
table 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 95.2
indoor 94.6
house 93
table 92.3
text 88.8
clothing 85.4
furniture 82.8
old 64
man 63.8
people 55.8
bottle 55.5

Face analysis

Amazon

Google

AWS Rekognition

Age 18-24
Gender Female, 73.8%
Calm 99%
Sad 0.6%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Female, 81.7%
Calm 98.7%
Sad 1%
Confused 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 59.9%
Sad 42.1%
Calm 39.5%
Confused 7%
Fear 3.6%
Happy 3.6%
Surprised 1.5%
Angry 1.5%
Disgusted 1.2%

AWS Rekognition

Age 21-29
Gender Female, 98.5%
Sad 33.5%
Disgusted 20.6%
Calm 17.4%
Angry 16.7%
Fear 5.4%
Confused 2.4%
Happy 2.4%
Surprised 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people sitting on a bed 78.1%
a group of people sitting in a room 78%
a group of people in a room 77.9%

Text analysis

Amazon

M117--YT37--NAOX

Google

°
--
-YT
2
ヨー-YTヨョA°2-->
ヨー
ヨョ
>
A