Human Generated Data

Title

Untitled (man leaning back over side of couch)

Date

1955, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.162

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Furniture 99.8
Sitting 95
Human 95
Person 95
Living Room 83.5
Room 83.5
Indoors 83.5
Interior Design 83.3
Chair 82.3
Person 72.2
Couch 64.4
Person 64.2
Electronics 62.9
Screen 62.9
People 62.5
Cushion 62.4
Restaurant 58.3
Home Decor 58
Monitor 56.8
Display 56.8
Musician 56.5
Musical Instrument 56.5
Shelf 56.2

Imagga
created on 2022-01-08

child 40.9
happy 33.8
mother 33.7
family 32.9
male 31.5
home 29.5
people 29
man 26.2
happiness 25.1
parent 25
adult 24.7
smiling 24.6
person 23.8
love 22.9
daughter 22.4
kid 21.3
indoors 21.1
together 21
couple 20.9
sitting 20.6
women 20.6
father 19.2
boy 18.3
smile 17.8
cute 17.2
portrait 16.8
grandma 16.3
lifestyle 15.9
indoor 15.5
couch 15.5
cheerful 15.4
son 14.9
youth 14.5
casual 14.4
kin 14.3
husband 13.4
adorable 12.9
room 12.9
chair 12.3
togetherness 12.3
attractive 11.9
children 11.8
childhood 11.6
little 11.5
dad 11.2
shop 11.2
looking 11.2
mature 11.2
affection 10.6
loving 10.5
brother 10.5
fun 10.5
blond 10.5
old 10.4
senior 10.3
men 10.3
shoe shop 10.3
joy 10
holding 9.9
married 9.6
wife 9.5
enjoying 9.5
clothing 9.5
baby 9.4
face 9.2
20s 9.2
leisure 9.1
outdoors 9
interior 8.8
sofa 8.8
living room 8.8
elderly 8.6
toddler 8.6
living 8.5
females 8.5
horizontal 8.4
teen 8.3
teenager 8.2
group 8.1
handsome 8
day 7.8
brunette 7.8
chairs 7.8
laptop 7.8
affectionate 7.7
two 7.6
domestic 7.6
adults 7.6
house 7.5
relationship 7.5
park 7.4
care 7.4
mercantile establishment 7.4
playing 7.3
cup 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.3
text 99.2
clothing 96.3
baby 95.1
toddler 94.8
sitting 94.3
human face 92.2
black and white 73.8
furniture 73.3
child 61.6
old 53.4
boy 50.1

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 99.9%
Happy 64.4%
Confused 12.9%
Sad 11.1%
Calm 5.3%
Disgusted 3.8%
Angry 1.1%
Fear 0.9%
Surprised 0.4%

AWS Rekognition

Age 25-35
Gender Female, 99.8%
Calm 82.2%
Surprised 4.6%
Angry 3.8%
Sad 2.9%
Confused 2.8%
Happy 2.1%
Disgusted 0.9%
Fear 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95%
Couch 64.4%

Captions

Microsoft

a man and a woman sitting on a table 55.6%
a man and a woman sitting at a table 55.5%
a person sitting on a table 55.4%