Human Generated Data

Title

Untitled (little boy and dog)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17339

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.9
Apparel 99.9
Person 97.1
Human 97.1
Animal 93.9
Cat 93.9
Pet 93.9
Mammal 93.9
Female 93.3
Dress 92.5
Face 89.4
Shoe 81.7
Footwear 81.7
Floor 81.1
Woman 80.1
Furniture 77.8
Food 76.9
Meal 76.9
Coat 75.9
Suit 75.9
Overcoat 75.9
Portrait 73.7
Photo 73.7
Photography 73.7
Fashion 72.2
Gown 72.2
Girl 71.2
Baby 70.5
Plant 68.9
Robe 68.2
Canine 66.6
Kid 65.9
Child 65.9
Grass 65.1
Cat 64.1
Couch 64.1
Chair 61.3
Man 58.8
Indoors 56.7
Shorts 56.6
Bridegroom 55.6
Wedding 55.6
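
The Amazon labels above are the kind of output produced by AWS Rekognition's label-detection API. As an illustrative sketch only (the museum's actual tagging pipeline is not documented here, and the image file name below is a placeholder), a similar label/confidence list could be generated with boto3:

import boto3

# Placeholder file name; the record does not include the actual image path.
IMAGE_PATH = "4.2002.17339.jpg"

rekognition = boto3.client("rekognition")  # uses default AWS credentials/region

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # roughly matches the lowest scores shown above
    )

# Print each label with its confidence score, one per line as above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')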

Imagga
created on 2022-02-26

brassiere 100
woman's clothing 100
undergarment 100
clothing 100
consumer goods 77.4
covering 77.4
garment 77.3
dress 27.1
commodity 25.6
bride 24.9
wedding 22.1
love 21.3
married 18.2
black 16.8
portrait 16.2
celebration 15.9
negative 15.2
people 13.4
groom 13.3
marriage 13.3
film 13.2
sexy 12
person 11.8
gown 11.7
adult 11.6
face 11.4
couple 11.3
bouquet 11.3
flowers 11.3
women 11.1
wed 10.8
veil 10.8
romantic 10.7
one 10.4
happiness 10.2
traditional 10
bridal 9.7
ceremony 9.7
photographic paper 9.5
pretty 9.1
attractive 9.1
fashion 9
human 9
child 9
romance 8.9
hair 8.7
cute 8.6
bright 8.6
smile 8.5
male 8.5
art 8.5
mask 8.5
head 8.4
happy 8.1
man 8.1
body 8
look 7.9
hands 7.8
eyes 7.7
engagement 7.7
culture 7.7
flower 7.7
two 7.6
skin 7.6
hand 7.6
elegance 7.6
tradition 7.4
makeup 7.3
decoration 7.2
suit 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.5
book 90.5
cat 86.4
black and white 77.2
carnivore 68.7

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 67.1%
Calm 51%
Happy 42.5%
Sad 1.7%
Fear 1.6%
Angry 1%
Surprised 1%
Disgusted 0.8%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
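
The age range, gender, and emotion scores above correspond to AWS Rekognition face detection. A minimal sketch, assuming the same placeholder image file and not the museum's actual workflow:

import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17339.jpg", "rb") as image_file:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 38, "High": 46}
    gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 67.1}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # CALM, HAPPY, SAD, ...
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')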

Feature analysis

Amazon

Person 97.1%
Cat 93.9%

Captions

Microsoft

a person holding a book 50.4%
a person standing next to a book 50.3%
a person standing in front of a book 47.1%

Text analysis

Amazon

6

Google

-YT3AA2-
MJI7- -YT3AA2- -XAO
-XAO
MJI7-
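
The strings above are OCR-style text detections (likely film-edge or negative markings). As a hedged sketch using the same placeholder file name, not the actual pipeline, Rekognition's detect_text call returns detections of this form:

import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.17339.jpg", "rb") as image_file:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# "LINE" detections group words; "WORD" detections are individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))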