Human Generated Data

Title

Untitled (backyard event with runway)

Date

September 7, 1952

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17999

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Person 99.4
Person 98.7
Outdoors 98
Nature 96.4
Person 95.1
Person 94.9
Shelter 93.2
Countryside 93.2
Building 93.2
Rural 93.2
Grass 91.4
Plant 91.4
Clothing 90.5
Apparel 90.5
Person 89.3
Meal 88.7
Food 88.7
Person 87.3
Person 83.2
Person 79.5
Person 78
Face 77.5
Tree 76.3
Person 76.2
Person 74.2
Crowd 73
People 72.7
Vegetation 72.6
Female 70.9
Park 67
Lawn 67
Shack 64.5
Hut 64.5
Person 64.4
Chair 63.8
Furniture 63.8
Yard 63.2
Photography 61.1
Photo 61.1
Girl 60.3
Leisure Activities 60.3
Shorts 59.8
Woman 57.3
Housing 56.6
Picnic 56.6
Vacation 56.6
Sitting 55.6
Person 51.7
Person 50.8

Clarifai
created on 2023-10-29

people 99.8
many 97.6
adult 97.5
group together 95.8
group 94.1
man 94.1
woman 92.9
military 92.6
one 92.6
war 88.7
street 88.2
vehicle 85.6
child 85.4
monochrome 84.8
leader 82.2
crowd 81.4
wear 79.3
administration 78.1
two 74.3
art 72.6

Imagga
created on 2022-03-04

old 23
classroom 23
piano 22.5
stringed instrument 21.8
grand piano 20
musical instrument 19.7
room 19.6
percussion instrument 17.2
vintage 16.5
keyboard instrument 16.3
grunge 15.3
water 14.7
black 14.4
landscape 14.1
sky 12.7
boat 12.2
antique 12.1
dirty 11.7
silhouette 11.6
retro 11.5
travel 11.3
space 10.9
dark 10.9
working 10.6
art 10.5
people 10
light 10
texture 9.7
style 8.9
night 8.9
man 8.7
wall 8.7
grungy 8.5
park 8.3
lake 8.2
building 8.2
sunset 8.1
mountain 8
architecture 7.9
person 7.9
outdoor 7.6
old fashioned 7.6
smoke 7.4
natural 7.4
business 7.3
computer 7.2
music 7.2

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 96.5
black and white 94.2
grave 74.2
cemetery 62
monochrome 61.4
man 58.1
person 57.7
funeral 50.3

Face analysis

Amazon

AWS Rekognition

Age 4-12
Gender Male, 98.4%
Calm 90.9%
Disgusted 5%
Sad 1.1%
Surprised 0.8%
Happy 0.7%
Angry 0.7%
Fear 0.5%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Female, 54.5%
Calm 30%
Happy 24.8%
Sad 21.5%
Surprised 12.6%
Fear 4.6%
Confused 3%
Disgusted 2.4%
Angry 1.1%

AWS Rekognition

Age 6-16
Gender Female, 86.4%
Sad 39.8%
Calm 15.5%
Angry 11.2%
Disgusted 9.7%
Surprised 9.2%
Happy 5.5%
Fear 4.8%
Confused 4.2%

AWS Rekognition

Age 47-53
Gender Male, 86.8%
Calm 72.4%
Sad 19.1%
Happy 2.9%
Confused 2.2%
Fear 1.1%
Disgusted 1%
Angry 0.7%
Surprised 0.6%

AWS Rekognition

Age 20-28
Gender Male, 85.2%
Calm 99%
Happy 0.5%
Confused 0.1%
Sad 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%

Feature analysis

Amazon

Person
Person 99.6%
Person 99.4%
Person 98.7%
Person 95.1%
Person 94.9%
Person 89.3%
Person 87.3%
Person 83.2%
Person 79.5%
Person 78%
Person 76.2%
Person 74.2%
Person 64.4%
Person 51.7%
Person 50.8%

Text analysis

Amazon

san
KODAK-S.VEEIX