Human Generated Data

Title

Untitled (children seated at party outside)

Date

1949

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18303

Machine Generated Data

Tags

Amazon
created on 2019-07-31

Human 99.2
Person 99.2
Person 99.1
Person 98.1
Person 98
Person 96.2
Clothing 95.1
Apparel 95.1
Person 94.5
Person 89.6
Person 87.5
Person 86.7
Person 84.4
Crowd 83.7
Furniture 83.7
Chair 83.7
Military 83.3
Dress 82.6
Person 81.9
Person 81.8
Nature 80.7
Outdoors 80.5
Military Uniform 80.1
People 78.2
Face 77.8
Person 76.9
Female 73.4
Kid 71.9
Child 71.9
Plant 69.4
Armored 68.7
Army 68.7
Tree 64.9
Photography 64.9
Photo 64.9
Portrait 64.1
Soldier 63.8
Food 62.7
Meal 62.7
Person 61.4
Girl 59.5
Park 57.8
Grass 57.8
Lawn 57.8
Coat 57.6
Suit 57.6
Overcoat 57.6
Fashion 56.8
Gown 56.8
Water 55.3
Person 55.2
Robe 55.1
Person 55
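
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. As a minimal sketch only, assuming boto3 is configured with AWS credentials and a default region, and using a hypothetical file name for a scan of the photograph:

```python
# Sketch: generate label/confidence tags with Amazon Rekognition DetectLabels.
# Assumes boto3 credentials and region are configured; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("sullivan_untitled_1949.jpg", "rb") as f:  # hypothetical scan of the print
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=60,        # return up to 60 labels
    MinConfidence=55.0,  # drop labels scored below 55%
)

# Print "Label confidence" pairs in the same shape as the listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```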

Clarifai
created on 2019-07-31

people 99.9
many 98.9
group 98.2
group together 98.2
war 97.7
adult 97.5
military 97
wear 94.9
soldier 93.1
man 92.8
child 91.7
skirmish 91.6
crowd 91.2
uniform 86.3
leader 84.5
vehicle 83.5
army 81.1
administration 80.8
recreation 80.2
outfit 79.8

Imagga
created on 2019-07-31

stone 22.5
landscape 21.6
travel 21.1
gravestone 18.5
tourism 18.2
sky 16.6
memorial 16.6
sand 15.7
rock 15.6
wall 15.5
structure 15.4
day 14.9
old 14.6
tree 14.3
mountain 14.2
water 14
building 13.9
outdoor 13
vacation 12.3
sea 11.7
park bench 10.8
coast 10.8
park 10.7
scenic 10.5
beach 10.3
animal 10.2
shore 10.2
architecture 10.2
snow 10.1
ocean 10.1
history 9.8
texture 9.7
fence 9.6
bench 9.5
cemetery 9.3
outdoors 9.2
stone wall 9.1
scenery 9
summer 9
forest 8.7
natural 8.7
scene 8.7
tourist 8.6
winter 8.5
bird 8.5
waves 8.4
city 8.3
child 8.3
barrier 8.3
landmark 8.1
colorful 7.9
ancient 7.8
grunge 7.7
town 7.4
environment 7.4
life 7.3
yellow 7.3
gray 7.2
holiday 7.2
sunlight 7.1
trees 7.1
rural 7
seat 7
season 7

Google
created on 2019-07-31

Microsoft
created on 2019-07-31

outdoor 97
person 92.1
text 85.1
black and white 84.5
people 69.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-57
Gender Female, 52.2%
Surprised 45.1%
Sad 53.7%
Happy 45.3%
Angry 45.3%
Calm 45.2%
Disgusted 45.1%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 51.7%
Calm 46.6%
Happy 49.7%
Confused 45.2%
Disgusted 45.3%
Surprised 45.4%
Angry 46.4%
Sad 46.3%

AWS Rekognition

Age 35-52
Gender Female, 54.8%
Sad 45.5%
Calm 50.2%
Angry 45.5%
Surprised 45.7%
Confused 45.3%
Happy 46.5%
Disgusted 46.3%

AWS Rekognition

Age 49-69
Gender Female, 54.2%
Confused 45.6%
Sad 47.5%
Surprised 45.6%
Angry 45.6%
Calm 49.3%
Disgusted 45.5%
Happy 46%

AWS Rekognition

Age 26-43
Gender Female, 51.4%
Disgusted 45.3%
Confused 45.6%
Sad 48.5%
Calm 48%
Angry 45.8%
Happy 46.1%
Surprised 45.7%

AWS Rekognition

Age 26-43
Gender Female, 52.3%
Calm 48.6%
Disgusted 45.1%
Surprised 45.3%
Happy 45.3%
Angry 45.4%
Confused 45.4%
Sad 50%

AWS Rekognition

Age 26-43
Gender Male, 50.8%
Calm 49.3%
Angry 46.1%
Sad 46.5%
Happy 45.7%
Confused 46%
Disgusted 45.5%
Surprised 45.9%

AWS Rekognition

Age 48-68
Gender Female, 54.2%
Happy 45.1%
Disgusted 54%
Confused 45.1%
Angry 45.3%
Sad 45.4%
Surprised 45.1%
Calm 45.1%

AWS Rekognition

Age 23-38
Gender Female, 51.7%
Calm 47.8%
Angry 45.4%
Disgusted 45.2%
Surprised 45.3%
Happy 45.1%
Sad 51%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 51.2%
Disgusted 45.1%
Calm 46%
Confused 45.2%
Sad 52.9%
Surprised 45.2%
Happy 45.2%
Angry 45.4%

AWS Rekognition

Age 23-38
Gender Female, 52.6%
Surprised 45.5%
Sad 50%
Calm 47.2%
Angry 45.9%
Happy 45.2%
Disgusted 45.4%
Confused 45.9%
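
Each block above corresponds to one face returned by Rekognition's DetectFaces operation when full attributes are requested. A hedged sketch of how the age range, gender, and emotion scores could be retrieved, under the same assumptions and hypothetical file name as the labelling sketch; the raw API values will not reproduce this listing exactly:

```python
# Sketch: per-face age, gender, and emotion estimates with Amazon Rekognition DetectFaces.
# Assumes boto3 credentials and region are configured; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("sullivan_untitled_1949.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```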

Feature analysis

Amazon

Person 99.2%
Chair 83.7%

Text analysis

Amazon

YT33A2
o8
XAGOX
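
The strings above are the text Rekognition detected in the image itself. A minimal sketch of how such detections could be produced with the DetectText operation, under the same assumptions as the earlier sketches:

```python
# Sketch: text detection with Amazon Rekognition DetectText.
# Assumes boto3 credentials and region are configured; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("sullivan_untitled_1949.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to the strings listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```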

Google

8OA YTヨヨA2-MAGOM
8OA
YT
ヨヨ
A2
-
MAGOM