Human Generated Data

Title

Untitled (man looking in woman's ear inside medical trailer)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9689

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.4
Clothing 98.7
Apparel 98.7
Shorts 75.4
Chair 73.1
Furniture 73.1
Weapon 72.5
Weaponry 72.5
Hat 64.7
Skirt 60.5
Gun 57.7
Cap 57.4

Imagga
created on 2022-01-23

person 28.2
barbershop 26.9
adult 25.4
man 24.8
people 23.4
shop 22.9
happy 21.9
smiling 20.2
chair 19.4
portrait 18.7
male 17.7
fashion 17.3
model 17.1
attractive 16.8
mercantile establishment 16.7
lifestyle 16.6
smile 15.7
sexy 15.2
black 13.9
sitting 13.7
youth 13.6
pretty 13.3
health 13.2
lady 13
seat 12.8
blond 12.7
hairdresser 12.7
indoors 12.3
senior 12.2
clothing 11.9
hair 11.9
equipment 11.8
wheelchair 11.4
cheerful 11.4
place of business 11.2
one 11.2
men 11.2
body 10.4
happiness 10.2
casual 10.2
cute 10
professional 10
playing 10
active 9.9
handsome 9.8
human 9.7
looking 9.6
elderly 9.6
gym 9.6
barber chair 9.5
women 9.5
mature 9.3
salon 9.3
elegance 9.2
music 9
fun 9
furniture 8.8
working 8.8
medical 8.8
standing 8.7
play 8.6
face 8.5
relaxation 8.4
old 8.4
leisure 8.3
worker 8.3
holding 8.2
care 8.2
healthy 8.2
posing 8
instrument 7.9
couple 7.8
retired 7.7
modern 7.7
device 7.7
retirement 7.7
musician 7.6
clothes 7.5
office 7.4
20s 7.3
alone 7.3
sensual 7.3
fitness 7.2
dress 7.2
home 7.2
job 7.1
together 7
bag 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.6
standing 92.3
clothing 81.3
text 74.5
black and white 61.6

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 96.9%
Calm 76.9%
Happy 16.1%
Surprised 2.1%
Disgusted 1.4%
Confused 1.2%
Sad 1.1%
Angry 0.6%
Fear 0.5%

AWS Rekognition

Age 13-21
Gender Female, 76%
Calm 91.5%
Disgusted 2.9%
Happy 1.8%
Sad 1.2%
Confused 0.9%
Angry 0.9%
Surprised 0.6%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Female, 53.8%
Sad 88.3%
Calm 9.2%
Happy 1.3%
Confused 0.3%
Angry 0.3%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Male, 87.6%
Fear 64.7%
Calm 15.8%
Happy 5.3%
Sad 5%
Disgusted 2.9%
Angry 2.7%
Surprised 2.2%
Confused 1.4%

AWS Rekognition

Age 18-24
Gender Female, 79.1%
Calm 83.6%
Surprised 8.4%
Happy 3.6%
Disgusted 2.5%
Fear 0.9%
Sad 0.6%
Angry 0.3%
Confused 0.2%

Feature analysis

Amazon

Person 99.5%
Skirt 60.5%

Captions

Microsoft

a man and a woman standing in front of a curtain 58.6%
a person standing in front of a curtain 58.5%
a person standing in front of a curtain 58.4%