Human Generated Data

Title

Untitled (little girl holding baby)

Date

c. 1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19728

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 96.8
Human 96.8
Furniture 95.5
Apparel 94.4
Clothing 94.4
Person 93.6
Female 91.3
Chair 86.2
Indoors 83.3
Couch 80.4
Room 79.4
Living Room 78.7
Floor 77.7
Flooring 76.6
Woman 76.1
Girl 71.7
Dress 71.1
Face 68.6
Portrait 68.6
Photography 68.6
Photo 68.6
People 65.3
Kid 63.2
Child 63.2
Baby 59.7
Dining Table 59.5
Table 59.5
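
The label/confidence pairs above follow the shape that Amazon Rekognition's detect_labels API returns. A minimal Python sketch of how such a list could be extracted, using a hypothetical sample response (the names and scores are copied from the list above; a real call would go through boto3's rekognition client, which is not invoked here):

```python
# Hypothetical excerpt of a Rekognition detect_labels response.
# A live call would look like:
#   boto3.client("rekognition").detect_labels(Image=..., MinConfidence=55)
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 96.8},
        {"Name": "Furniture", "Confidence": 95.5},
        {"Name": "Clothing", "Confidence": 94.4},
    ]
}

def extract_tags(response, min_confidence=55.0):
    """Return (name, confidence) pairs sorted by descending confidence."""
    tags = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(tags, key=lambda t: -t[1])

print(extract_tags(sample_response))
```

The confidence threshold is an assumption; the service itself returns whatever clears the MinConfidence passed in the request.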

Imagga
created on 2022-03-05

person 34.1
adult 31.8
people 29.6
man 27.6
hairdresser 25.9
blackboard 24.4
teacher 22.4
professional 22.1
black 21
male 20.6
salon 18.5
portrait 18.1
happy 16.9
human 15
face 14.9
women 14.2
pretty 14
attractive 14
education 13.9
style 12.6
work 12.6
lifestyle 12.3
smile 12.1
brass 12.1
educator 12.1
hair 11.9
student 11.8
business 11.5
fashion 11.3
couple 10.5
one 10.5
sexy 10.4
men 10.3
device 9.9
holding 9.9
class 9.6
looking 9.6
casual 9.3
equipment 9.2
occupation 9.2
modern 9.1
music 9.1
lady 8.9
science 8.9
brunette 8.7
love 8.7
wind instrument 8.6
musical instrument 8.6
exam 8.6
happiness 8.6
cute 8.6
elegance 8.4
study 8.4
color 8.3
school 8.2
cheerful 8.1
office 8
body 8
smiling 8
businessman 7.9
medical 7.9
photographer 7.9
glass 7.9
chalkboard 7.8
hands 7.8
model 7.8
sitting 7.7
cornet 7.7
expression 7.7
university 7.6
health 7.6
room 7.6
hand 7.6
college 7.6
patient 7.5
blond 7.5
technology 7.4
suit 7.4
board 7.2
dress 7.2
worker 7.2
handsome 7.1
romantic 7.1
posing 7.1
working 7.1
indoors 7
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 93.2
black and white 92.7
clothing 88.6
person 87.4
indoor 85.9

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 81.6%
Calm 62.5%
Sad 36.4%
Happy 0.4%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%
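
The age range, gender, and emotion scores above correspond to one FaceDetails entry from Rekognition's detect_faces API (with all attributes requested). A minimal sketch of condensing such an entry into the fields shown, using a hypothetical sample built from the values above (no live API call is made):

```python
# Hypothetical excerpt of one FaceDetails entry from a
# Rekognition detect_faces response with Attributes=["ALL"].
sample_face = {
    "AgeRange": {"Low": 21, "High": 29},
    "Gender": {"Value": "Female", "Confidence": 81.6},
    "Emotions": [
        {"Type": "CALM", "Confidence": 62.5},
        {"Type": "SAD", "Confidence": 36.4},
        {"Type": "HAPPY", "Confidence": 0.4},
    ],
}

def summarize_face(face):
    """Condense one FaceDetails entry into an age/gender/emotion summary."""
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f'{face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}',
        "gender": face["Gender"]["Value"],
        "dominant_emotion": top_emotion["Type"],
    }

print(summarize_face(sample_face))
```

Emotion types come back in upper case from the service; the "Calm 62.5%" style above is a display-layer formatting choice.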

Feature analysis

Amazon

Person 96.8%

Captions

Microsoft

a man and a woman standing in front of a window 39.9%
a person standing in front of a window 39.8%
a group of people standing in front of a window 39.7%

Text analysis

Amazon

25233
KODAKE1A--1T

Google

O MJI7--YT3RA 2--A
2--A
O
MJI7--YT3RA