Human Generated Data

Title

Untitled (girls with large doll)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17852

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.8
Apparel 99.8
Footwear 99.5
Shoe 99.5
Furniture 99.5
Chair 99.5
Person 97.5
Human 97.5
Person 96.7
Person 93.8
Dress 86.6
Female 85.6
Shorts 84.1
Accessory 77
Accessories 77
Woman 70.2
Necklace 60.5
Jewelry 60.5
Photo 60.1
Photography 60.1
Portrait 59.5
Face 59.5
Couch 56.4
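
The Amazon tag scores above are the kind of label/confidence output returned by the Rekognition DetectLabels API. A minimal sketch using boto3; the image path and the MinConfidence threshold are assumptions, not part of this record:

    # Sketch: reproduce Rekognition-style label tags for a local image file.
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # assumed threshold, not taken from this record
    )

    for label in response["Labels"]:
        # Prints e.g. "Clothing 99.8", matching the tag/score format above.
        print(f"{label['Name']} {label['Confidence']:.1f}")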

Imagga
created on 2022-02-26

man 28.2
people 25.1
person 25
male 22
adult 15.9
sport 15.5
play 13.8
men 13.7
black 12.6
body 12
planner 11.9
ball 11.8
lifestyle 11.5
athlete 11.5
human 11.2
weapon 11.2
action 11.1
equipment 11
playing 10.9
game 10.7
fashion 10.5
outdoors 10.4
portrait 10.3
women 10.3
mask 9.7
costume 9.6
player 9.5
happy 9.4
model 9.3
leisure 9.1
fun 9
lady 8.9
technology 8.9
boy 8.7
smile 8.5
sword 8.5
face 8.5
power 8.4
old 8.3
breastplate 8.3
competition 8.2
style 8.1
active 7.9
standing 7.8
run 7.7
pretty 7.7
attractive 7.7
health 7.6
traditional 7.5
city 7.5
one 7.5
street 7.4
sexy 7.2
recreation 7.2
team 7.2
helmet 7.1
patient 7.1
to 7.1
medical 7.1
uniform 7
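
The Imagga tags follow the same tag/confidence pattern and could be fetched from Imagga's v2 tagging endpoint. A hedged sketch using the requests library; the API key, secret, and image URL are placeholders:

    # Sketch: query Imagga's /v2/tags endpoint for an image URL.
    # API_KEY, API_SECRET, and IMAGE_URL are placeholders, not real credentials.
    import requests

    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.com/photo.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # Prints e.g. "man 28.2", matching the tag/score listing above.
        print(f"{item['tag']['en']} {item['confidence']:.1f}")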

Google
created on 2022-02-26

Photograph 94.2
Black 89.7
Black-and-white 85.3
Gesture 85.3
Style 84.1
Happy 79.4
Snapshot 74.3
Monochrome photography 72.7
Art 72.1
Event 71.8
Monochrome 70.7
Vintage clothing 67.5
Street fashion 66.4
Fun 64.1
Sitting 62.5
Child 62.1
Pattern 59.9
Chair 57.9
Luggage and bags 57.8
Visual arts 56.1
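
The Google tags correspond to label detection in the Cloud Vision API, where scores are returned as floats between 0 and 1 and shown here as percentages. A minimal sketch with the google-cloud-vision client; the file path is a placeholder:

    # Sketch: run Cloud Vision label detection on a local image file.
    # Assumes Google application credentials are configured.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    for label in response.label_annotations:
        # Cloud Vision scores are 0-1; scale by 100 to match the listing above.
        print(f"{label.description} {label.score * 100:.1f}")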

Microsoft
created on 2022-02-26

dress 95.4
person 95.1
wedding dress 94
text 86.5
clothing 86.4
bride 86.2
dance 84.9
woman 83.2
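
The Microsoft tags resemble output from the Azure Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file path are placeholders:

    # Sketch: tag a local image with Azure Computer Vision.
    # ENDPOINT, KEY, and the file path are placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
    KEY = "your_key"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        # Prints e.g. "dress 95.4", matching the tag/score listing above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")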

Face analysis

AWS Rekognition

Age 10-18
Gender Male, 61.9%
Sad 41.6%
Calm 22%
Surprised 12.2%
Fear 11.2%
Happy 5.5%
Disgusted 2.9%
Angry 2.5%
Confused 2.1%

AWS Rekognition

Age 40-48
Gender Male, 97.4%
Fear 75.7%
Surprised 23.1%
Angry 0.6%
Happy 0.3%
Calm 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0%
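
The age range, gender, and emotion percentages above are the fields returned per face by Rekognition's DetectFaces API when all attributes are requested. A minimal boto3 sketch; the image path is a placeholder:

    # Sketch: request full face attributes from Rekognition for a local image.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")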

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
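
The Google Vision rows report likelihood buckets rather than percentages; they come from the Cloud Vision face detection response. A sketch with the google-cloud-vision client; the file path is a placeholder:

    # Sketch: read likelihood buckets from Cloud Vision face detection.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)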

Feature analysis

Amazon

Shoe 99.5%
Person 97.5%
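
The feature analysis scores (Shoe, Person) correspond to labels for which Rekognition also returns bounding-box instances in the same DetectLabels response. A brief sketch of reading those instances, under the same placeholder assumptions as above:

    # Sketch: list labels that carry bounding-box instances, e.g. Shoe and Person.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f"{label['Name']} {label['Confidence']:.1f}% "
                  f"at left={box['Left']:.2f}, top={box['Top']:.2f}")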

Captions

Microsoft

a group of people playing frisbee in the air 53.3%
a group of people playing a game of frisbee 33.1%
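
The two captions and their confidences look like output from Azure Computer Vision's describe operation, which can return several candidate captions per image. A sketch using the same placeholder endpoint, key, and path as above:

    # Sketch: request candidate captions from Azure Computer Vision.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_key"),
    )

    with open("photo.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        # Prints e.g. "a group of people playing frisbee in the air 53.3%"
        print(f"{caption.text} {caption.confidence * 100:.1f}%")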

Text analysis

Amazon

aau
RAGON
YES RAGON
YES
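
The detected strings above (aau, RAGON, and so on) are the kind of result returned by Rekognition's DetectText API, which reports each line and word it finds along with a confidence score. A minimal boto3 sketch; the image path is a placeholder:

    # Sketch: extract text detections from an image with Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Each detection is a LINE or WORD with its own confidence score.
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}%")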