Human Generated Data

Title

Untitled (woman with two little girls)

Date

1958

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17519

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 98.6
Person 98.6
Apparel 98.4
Clothing 98.4
Person 98.3
Footwear 97.8
Shoe 97.8
Floor 96.9
Flooring 95
Helmet 94.4
Dress 91.6
Female 91.3
Chair 87.9
Furniture 87.9
Person 87.7
Costume 82.6
Indoors 82.1
Child 81.4
Blonde 81.4
Woman 81.4
Teen 81.4
Girl 81.4
Kid 81.4
People 75.6
Shoe 74.5
Room 71.3
Photography 68.4
Portrait 68.4
Photo 68.4
Face 68.4
Living Room 61.3
Overcoat 60.1
Coat 60.1
Suit 60.1
Door 58.3
Shorts 56
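
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label-detection API. Below is a minimal sketch of how such tags could be reproduced with boto3; the filename, region, and thresholds are placeholders, not values taken from this record.

    # Sketch only: assumes boto3 credentials are configured and that a local
    # copy of the photograph exists at the (hypothetical) path below.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("4.2002.17519.jpg", "rb") as f:  # hypothetical local filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,
    )

    # Print each label with its confidence score, mirroring the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

The listed scores stop around 56, which is consistent with a minimum-confidence cutoff of roughly 55, though the exact threshold used for this record is not documented here.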

Imagga
created on 2022-02-26

umbrella 56
canopy 43
shelter 33.2
dress 26.2
people 24.5
fashion 23.4
protective covering 22.3
adult 21.4
portrait 20.7
attractive 20.3
domestic 18.8
person 18.8
salon 17.6
women 17.4
lady 17
black 15.7
human 15.7
pretty 15.4
elegance 15.1
cute 15.1
male 14.9
happiness 14.9
man 14.8
happy 14.4
love 14.2
couple 13.9
groom 13.8
style 13.3
model 13.2
sexy 12.8
bride 12.5
posing 12.4
covering 12.3
wedding 11.9
hair 11.9
lifestyle 11.6
looking 11.2
men 11.2
casual 11
girls 10.9
holding 10.7
face 10.6
look 10.5
urban 10.5
legs 10.4
smile 10
city 10
gorgeous 10
one 9.7
elegant 9.4
blackboard 9.4
two 9.3
child 9.3
makeup 9.1
mother 9
romantic 8.9
family 8.9
lovely 8.9
body 8.8
celebration 8.8
brunette 8.7
flowers 8.7
modern 8.4
old 8.4
sensuality 8.2
cheerful 8.1
interior 8
holiday 7.9
standing 7.8
mall 7.8
dance 7.7
luxury 7.7
shop 7.7
married 7.7
life 7.6
head 7.6
bouquet 7.5
daughter 7.5
clothing 7.5
retro 7.4
blond 7.3
light 7.3
shopping 7.3
business 7.3
group 7.2
decoration 7.2
art 7.2
home 7.2
hairdresser 7.1
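
The Imagga tags above follow the same tag/confidence pattern. A minimal sketch using Imagga's v2 tagging endpoint over HTTP is shown below; the endpoint, response layout, credentials, and image URL are assumptions, not details taken from this record.

    # Sketch only: assumes Imagga's v2 REST API with HTTP Basic authentication.
    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
    API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
    IMAGE_URL = "https://example.org/4.2002.17519.jpg"  # hypothetical URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score, as listed above.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")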

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 83.3
dance 77.9
footwear 73.8
black and white 72.5
clothing 69.2
umbrella 57.6

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 97.8%
Sad 46.2%
Surprised 18.5%
Happy 9.5%
Calm 8.6%
Angry 5.5%
Disgusted 4.6%
Confused 3.9%
Fear 3.1%
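
The AWS Rekognition values above (an age range, a gender estimate, and an emotion breakdown) correspond to the output of Rekognition's face-detection API when all facial attributes are requested. A minimal sketch with boto3 follows; the filename is hypothetical.

    # Sketch only: assumes boto3 credentials are configured.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.17519.jpg", "rb") as f:  # hypothetical local filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are returned unordered; sort by confidence to match the
        # listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

Rekognition reports a confidence for every emotion category rather than a single label, which is why several emotions appear above with low scores.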

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
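
The Google Vision values above are likelihood ratings rather than percentages. A minimal sketch using the google-cloud-vision client library is shown below; it assumes application-default credentials and a hypothetical local filename.

    # Sketch only: the likelihood fields are enums (VERY_UNLIKELY .. VERY_LIKELY),
    # which correspond to the "Very unlikely" values reported above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4.2002.17519.jpg", "rb") as f:  # hypothetical local filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)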

Feature analysis

Amazon

Person 98.6%
Shoe 97.8%
Helmet 94.4%

Captions

Microsoft

a person holding an umbrella 56.9%
a person holding an umbrella in front of a window 41.9%
a person holding a umbrella 41.8%
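
The Microsoft captions above are ranked caption candidates with confidence scores, of the kind produced by the Azure Computer Vision describe operation (the same service also yields the Microsoft tag list earlier in this record). A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and filename are placeholders.

    # Sketch only: assumes an Azure Computer Vision resource and its key/endpoint.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                           # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("4.2002.17519.jpg", "rb") as f:  # hypothetical local filename
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Confidences are returned in the 0-1 range; scale to percentages as above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")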

Text analysis

Amazon

4
5.
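
The Amazon text results above ("4" and "5.") are short strings that Rekognition's text-detection API found in the image, likely notations on the print itself. A minimal sketch with boto3 follows; the filename is hypothetical.

    # Sketch only: prints each detected line of text with its confidence.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.17519.jpg", "rb") as f:  # hypothetical local filename
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")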

Google

MJIA- -YT3RA°2--XAa 5.
MJIA-
5.
-YT3RA°2--XAa