Human Generated Data

Title

Untitled (adults watching children play on floor)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17731

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Shoe 99.8
Footwear 99.8
Clothing 99.8
Apparel 99.8
Shoe 99.7
Person 99.1
Human 99.1
Person 97.5
Person 94.4
Collage 83
Poster 83
Advertisement 83
Face 82.9
Costume 81.9
Female 78.2
Dress 75.8
Mammal 73.7
Animal 73.7
Pet 71.4
Canine 71.3
Girl 69.7
Photography 60.7
Photo 60.7
Floor 58.3
Shorts 57.1

Clarifai
created on 2023-10-29

dog 99.9
people 99.9
canine 99.9
monochrome 98.8
adult 98.5
group 97.9
wear 97
two 97
man 96.7
mammal 93.8
humor 92.9
group together 92.2
movie 91.4
actor 91.1
pet 90.1
nostalgia 89.5
boxer 89.3
puppy 88.7
veil 88
sit 86.5

Imagga
created on 2022-02-26

newspaper 31.9
product 25.4
person 24.4
creation 19.9
people 17.3
man 15.4
dress 15.3
adult 14
planner 14
celebration 13.5
negative 12.3
male 12
women 11.1
glass 11
wedding 11
lifestyle 10.8
fashion 10.5
drawing 10.4
party 10.3
decoration 10.3
sport 10.2
day 10.2
film 9.8
human 9.7
portrait 9.7
bride 9.6
bouquet 8.8
medical 8.8
happy 8.8
urban 8.7
groom 8.7
love 8.7
luxury 8.6
costume 8.5
health 8.3
technology 8.2
team 8.1
romantic 8
sketch 7.9
hair 7.9
wed 7.9
flowers 7.8
face 7.8
gown 7.8
color 7.8
gift 7.7
men 7.7
elegance 7.6
city 7.5
outdoors 7.5
one 7.5
holding 7.4
care 7.4
lady 7.3
clothing 7.3
chair 7.3
black 7.2
suit 7.2
body 7.2
science 7.1
work 7.1
medicine 7
indoors 7
modern 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.5
dog 95.5
drawing 85.7
clothing 70.3
sketch 67.6
person 64.9
posing 40.8

Color Analysis

Face analysis

AWS Rekognition

Age 10-18
Gender Female, 76.8%
Calm 99.2%
Sad 0.3%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Confused 0%
Angry 0%

AWS Rekognition

Age 40-48
Gender Male, 98%
Calm 30.1%
Happy 26.7%
Confused 11.7%
Sad 9.8%
Surprised 9.4%
Fear 6.3%
Disgusted 4.1%
Angry 1.8%

AWS Rekognition

Age 48-56
Gender Male, 97.6%
Happy 99.8%
Confused 0.1%
Surprised 0.1%
Calm 0.1%
Angry 0%
Disgusted 0%
Fear 0%
Sad 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Shoe
Person
Shoe 99.8%
Shoe 99.7%
Person 99.1%
Person 97.5%
Person 94.4%

Categories

Imagga

paintings art 98.5%

Text analysis

Amazon

14
ОГД
KODVK-AVEELA