Human Generated Data

Title

Untitled (woman examining holiday gift as three others watch)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9133

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 98.8
Person 96.6
Person 95.5
Clothing 90.3
Apparel 90.3
Person 80.4
Indoors 73.4
Room 69.7
Head 61.5

Imagga
created on 2022-01-23

salon 67.9
hairdresser 64.9
barbershop 39.1
man 37
shop 33.3
people 32.9
male 28.4
home 27.9
adult 27.2
person 26
senior 25.3
indoors 24.6
mercantile establishment 23.4
couple 21.8
sitting 20.6
happy 20.1
room 19.3
lifestyle 18.1
men 18
family 17.8
indoor 16.4
casual 16.1
work 15.7
place of business 15.6
portrait 15.5
smiling 15.2
chair 15
mature 14.9
professional 13.9
20s 13.7
office 13.7
elderly 13.4
women 12.7
retirement 12.5
cheerful 12.2
smile 12.1
computer 12
camera 12
two 11.9
old 11.8
health 11.8
happiness 11.8
retired 11.6
patient 11.6
worker 11.6
medical 11.5
horizontal 10.9
60s 10.7
face 10.7
couch 10.6
together 10.5
doctor 10.3
business 10.3
70s 9.8
interior 9.7
businessman 9.7
talking 9.5
alone 9.1
clothing 9.1
working 8.8
looking 8.8
older 8.7
hospital 8.7
clinic 8.6
husband 8.6
mother 8.6
adults 8.5
inside 8.3
occupation 8.2
human 8.2
teacher 8.2
laptop 8.2
job 8
seniors 7.9
pensioner 7.8
living room 7.8
establishment 7.8
table 7.8
two people 7.8
father 7.8
attractive 7.7
loving 7.6
child 7.6
desk 7.6
togetherness 7.6
fashion 7.5
holding 7.4
life 7.4
phone 7.4
lady 7.3
aged 7.2
color 7.2
dress 7.2
handsome 7.1
love 7.1
medicine 7
nurse 7

Google
created on 2022-01-23

(no labels returned)

Microsoft
created on 2022-01-23

person 99.3
text 96.7
clothing 90.4
black and white 70.7
wedding dress 62.3
man 52.2

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 93.8%
Happy 98.4%
Sad 0.6%
Surprised 0.2%
Calm 0.2%
Confused 0.2%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 38-46
Gender Female, 84.9%
Happy 88.8%
Calm 7.3%
Surprised 1.2%
Fear 0.8%
Confused 0.5%
Angry 0.5%
Disgusted 0.5%
Sad 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people looking at a laptop 70.2%
a group of people sitting in front of a window 64.2%
a group of people sitting in front of a laptop 61.5%

Text analysis

Amazon

3
8
MJIF
MJIF ОСЛИ
000
ОСЛИ

Google

MJ17 YT33A2 A
MJ17
YT33A2
A