Human Generated Data

Title

Untitled (three girls standing on porch with arms around each other)

Date

1938

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2085

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.9
Apparel 99.9
Person 99.8
Human 99.8
Person 99.7
Person 99.2
Shorts 97.5
Shoe 97.4
Footwear 97.4
Home Decor 96.9
Shoe 95.8
Shoe 91.5
Helmet 87.5
Building 83.4
Flooring 81.3
Female 79.3
Window 78.6
Shoe 70.1
Floor 70
Path 68.8
Dress 67.2
Girl 67
Shoe 65.9
Curtain 65.5
Face 65.1
Photo 65.1
Portrait 65.1
Photography 65.1
Kid 60.4
Child 60.4
Woman 57.2
Boardwalk 56.6
Bridge 56.6
Chair 56.4
Furniture 56.4
Porch 55.8
Nature 55.8
Rural 55.8
Shelter 55.8
Outdoors 55.8
Countryside 55.8
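
The label list above follows the output format of AWS Rekognition label detection (label name plus confidence score). A minimal sketch of how such a listing could be reproduced with the boto3 SDK is shown below; the local filename and the 55% confidence floor are assumptions chosen to match the listing, not details of the museum's actual pipeline.

```python
import boto3

# Assumed workflow (illustrative only): send a scan of the photograph to
# AWS Rekognition and print each detected label with its confidence score,
# mirroring the "label score" lines listed above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.2085.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the listing above bottoms out around 55%
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```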

Clarifai
created on 2023-10-15

people 99.9
two 98.8
group 97.3
adult 97
woman 96.3
administration 95.9
wear 94.6
three 93.4
family 92.5
street 92
man 91.6
portrait 91.5
child 90.1
leader 89
group together 88.3
four 87.8
monochrome 87.4
actress 86.1
offspring 84.6
sibling 84.6

Imagga
created on 2021-12-14

people 35.7
person 29.1
adult 28.4
women 22.1
portrait 21.3
fashion 20.3
lifestyle 19.5
happy 19.4
man 18.9
city 18.3
casual 17.8
attractive 16.8
walking 16.1
shop 15.9
smile 15.7
standing 15.6
shopping 15.6
lady 15.4
pretty 15.4
male 15
happiness 14.9
urban 14
smiling 13.7
hospital 13.3
negative 13
street 12.9
business 12.8
human 12.7
mall 12.7
modern 12.6
bag 12.4
holding 12.4
life 12.3
buy 12.2
style 11.9
interior 11.5
customer 11.4
retail 11.4
store 11.3
one 11.2
men 11.2
professional 10.9
family 10.7
film 10.2
cute 10
building 10
dress 9.9
mother 9.8
old 9.7
group 9.7
looking 9.6
couple 9.6
model 9.3
clothing 9.3
sale 9.2
girls 9.1
posing 8.9
consumerism 8.8
indoors 8.8
bags 8.8
travel 8.4
black 8.4
room 8.4
leisure 8.3
alone 8.2
art 8.2
cheerful 8.1
daughter 7.9
photographic paper 7.9
day 7.8
face 7.8
color 7.8
full length 7.8
corporate 7.7
blurred 7.7
walk 7.6
child 7.6
elegance 7.6
indoor 7.3
businesswoman 7.3
sexy 7.2
activity 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

window 98.8
clothing 93.3
person 92.9
text 92.5
footwear 91.8
standing 79.2
dress 72.5
black and white 71.5
woman 69.7
street 68.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-50
Gender Female, 96.4%
Happy 97.5%
Calm 2.2%
Sad 0.1%
Surprised 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-34
Gender Female, 79.2%
Happy 47.6%
Angry 21.1%
Calm 11.8%
Surprised 10.3%
Fear 4.7%
Sad 2.1%
Disgusted 1.4%
Confused 1%

AWS Rekognition

Age 22-34
Gender Female, 59.8%
Calm 73.3%
Happy 21.4%
Sad 2.6%
Surprised 1.3%
Angry 0.9%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%
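
The age ranges, gender estimates, and emotion percentages above correspond to fields returned by AWS Rekognition face detection. The sketch below shows how such values could be requested with boto3, assuming a hypothetical local copy of the image; it is an illustrative example, not the museum's documented workflow.

```python
import boto3

# Assumed workflow (illustrative only): request full face attributes and
# print the age range, gender, and emotion scores for each detected face,
# in the same form as the listings above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.2085.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```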

Feature analysis

Amazon

Person 99.8%
Shoe 97.4%
Helmet 87.5%

Categories