Human Generated Data

Title

Untitled (baby girl playing with kitchen pot on living room floor)

Date

c. 1940-1962

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9887

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.1
Human 99.1
Furniture 97.4
Chair 96.7
Baby 92.6
Floor 82.6
Bird 82.1
Animal 82.1
Sitting 81.4
Flooring 77.9
Indoors 77.4
Bird 71.6
Living Room 71.5
Room 71.5
Kid 71.1
Child 71.1
Bird 70.6
Photography 63.3
Photo 63.3
Clothing 62.2
Apparel 62.2
Screen 56.6
Electronics 56.6
Tire 56.3
Urban 55.9
Road 55.9
Building 55.9
Street 55.9
City 55.9
Town 55.9
Couch 55.5
Crawling 55.3
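
The Amazon list above pairs each label with a confidence score. As a minimal illustration (not part of the record itself), these pairs could be filtered by a confidence threshold; the label names and scores below are taken from the list above, and the function name is a hypothetical example:

```python
# Hypothetical sketch: filtering machine-generated label/confidence
# pairs by a confidence threshold. Values are copied from the
# Amazon Rekognition list above.
labels = [
    ("Person", 99.1),
    ("Furniture", 97.4),
    ("Chair", 96.7),
    ("Baby", 92.6),
    ("Floor", 82.6),
    ("Couch", 55.5),
]

def filter_labels(pairs, threshold):
    """Keep only the label names whose confidence meets the threshold."""
    return [name for name, conf in pairs if conf >= threshold]

high_confidence = filter_labels(labels, 90.0)
# → ['Person', 'Furniture', 'Chair', 'Baby']
```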

Imagga
created on 2022-01-28

man 29.6
newspaper 29.2
person 28.5
people 26.8
product 22.4
male 22.1
adult 19.7
sitting 18.9
creation 17.4
casual 16.9
portrait 16.2
lifestyle 15.9
happy 15.7
business 15.2
grandfather 15.1
businessman 15
outdoors 14.9
scholar 14.7
youth 14.5
day 14.1
room 14.1
couple 13.9
alone 13.7
grandma 13.4
working 13.3
happiness 12.5
cheerful 12.2
looking 12
one 11.9
intellectual 11.8
indoors 11.4
office 11.2
love 11
smiling 10.8
holding 10.7
attractive 10.5
child 10.4
home 10.4
women 10.3
black 10.2
relax 10.1
indoor 10
joy 10
smile 10
fashion 9.8
computer 9.7
urban 9.6
men 9.4
work 9.4
model 9.3
musical instrument 9.3
relaxation 9.2
back 9.2
city 9.1
pretty 9.1
chair 9
handsome 8.9
full length 8.7
life 8.7
boy 8.7
standing 8.7
school 8.7
cute 8.6
face 8.5
two 8.5
finance 8.4
park 8.2
fun 8.2
lady 8.1
job 8
little 7.9
education 7.8
color 7.8
studying 7.7
bench 7.6
friends 7.5
human 7.5
leisure 7.5
building 7.5
silhouette 7.5
floor 7.4
parent 7.3
copy space 7.2
hair 7.1
worker 7.1
interior 7.1
together 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

window 96.7
text 93.5
indoor 91.6
bed 86.3
black and white 82.3
toddler 73.5

Face analysis

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Bird 82.1%
Couch 55.5%

Captions

Microsoft

a person sitting on a bed looking out a window 67.2%
a person sitting on a bed next to a window 65.7%
a person sitting in front of a window 65.6%

Text analysis

Amazon

MJIR
2$
45021
YT3RAB

Google

A3
YT3
MJ1R YT3ㅋA3
MJ1R