Human Generated Data

Title

Untitled (two little girls in matching dresses and hats)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17334

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.5
Apparel 99.5
Person 99.3
Human 99.3
Dress 96.3
Face 95.8
Female 94
Person 91.6
Person 85.9
Shorts 85.2
Car 84.7
Transportation 84.7
Vehicle 84.7
Automobile 84.7
Person 83.4
Footwear 82.9
Shoe 82.9
Costume 80.5
Photo 79.2
Photography 79.2
Portrait 79.2
Girl 78.4
Outdoors 76
Woman 74.9
Kid 74.5
Child 74.5
Glasses 73.5
Accessories 73.5
Accessory 73.5
Person 72.5
Person 72.4
Nature 69.5
Goggles 66.2
Play 64.4
Leisure Activities 59.8
Indoors 59.1
Water 58.7
Floor 57
Pool 56.2
Shoe 55.3
Furniture 55.1

Imagga
created on 2022-02-26

man 31.6
male 29.2
people 27.3
person 26.6
coat 24.4
groom 24.3
lab coat 23.6
adult 23
professional 22.6
happy 20.7
businessman 17.7
men 17.2
business 17
bride 16.6
couple 16.5
portrait 16.2
medical 15.9
wedding 15.6
dress 15.4
clothing 14.8
work 14.1
smiling 13.7
smile 13.5
love 13.4
happiness 13.3
attractive 13.3
job 13.3
patient 13.2
doctor 13.1
corporate 12.9
looking 12.8
hospital 12.8
office 12
occupation 11.9
health 11.8
nurse 11.5
garment 11.1
two 11
medicine 10.6
indoors 10.5
businesswoman 10
pretty 9.8
human 9.7
together 9.6
home 9.6
cute 9.3
suit 9.1
holding 9.1
life 9.1
care 9
team 9
worker 8.9
building 8.8
lab 8.7
brunette 8.7
day 8.6
model 8.6
marriage 8.5
emotion 8.3
fashion 8.3
room 8.1
romantic 8
interior 8
hair 7.9
women 7.9
child 7.8
face 7.8
modern 7.7
test 7.7
married 7.7
hand 7.6
teacher 7.4
black 7.2
world 7.1
husband 7.1
handsome 7.1
working 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 95.5
black and white 89.8
person 85.7
drawing 50.3

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Female, 83.1%
Fear 51.4%
Disgusted 18.9%
Sad 11.5%
Calm 7.7%
Confused 3.3%
Surprised 3%
Angry 2.6%
Happy 1.7%

AWS Rekognition

Age 18-26
Gender Male, 98%
Surprised 87.9%
Happy 4.5%
Calm 3.3%
Disgusted 1.8%
Angry 1.1%
Fear 0.9%
Sad 0.3%
Confused 0.2%

Feature analysis

Amazon

Person 99.3%
Car 84.7%
Shoe 82.9%

Captions

Microsoft

a person posing for the camera 72.8%
a girl posing for a photo 34.8%
an old photo of a person 34.7%

Text analysis

Amazon

MJ17

Google

YT3RA2
MJ17
0ɔ2
MJ17 YT3RA2 0ɔ2