Human Generated Data

Title

Untitled (children next to Liberty Bell during transport)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2468

Machine Generated Data

Tags

Amazon
created on 2019-06-17

Human 99.7
Person 99.7
Person 99.7
Person 99.6
Shorts 99.6
Apparel 99.6
Clothing 99.6
Person 99.1
Person 98.7
Person 98.7
Person 98.5
Person 98.4
Person 98.4
Person 98
Person 96.3
Person 90
Nature 89.4
Shelter 89.4
Countryside 89.4
Outdoors 89.4
Rural 89.4
Building 89.4
Person 88.7
Dress 88.2
Person 86.3
Female 85.2
Pedestrian 81.1
Transportation 78.5
Vehicle 77.9
Kid 76.9
Child 76.9
Crowd 76.2
Face 75.6
People 75.1
Path 73.8
Urban 72.8
Aircraft 71.4
Helicopter 71.4
City 70.6
Town 70.6
Person 70.4
Road 69.5
Street 69.5
Person 69
Water 67.6
Housing 66.8
Photography 65.6
Photo 65.6
Portrait 65.6
Tree 64.8
Plant 64.8
Woman 64.7
Girl 63.4
Waterfront 58.4
Pier 58.4
Dock 58.4
Port 58.4
Architecture 58
House 56.4
Villa 56.4
Downtown 56.4
Play 55.5
Coat 55
Person 53.4

Clarifai
created on 2019-06-17

people 100
group together 99.7
many 99.5
group 99.3
vehicle 98.9
adult 95.9
child 95.4
man 93.9
transportation system 93.8
crowd 91.7
military 86.3
several 86.3
administration 84.7
woman 84.2
war 83.7
wear 82.9
outfit 82.5
street 82.1
aircraft 81.1
spectator 74.1

Imagga
created on 2019-06-17

world 43.2
city 30.8
architecture 27.3
garbage truck 25.5
travel 23.2
history 21.5
truck 20.4
building 19.7
tourism 19
old 18.8
street 16.6
motor vehicle 15.8
urban 15.7
landmark 15.3
sky 15.3
people 15.1
water 14.7
palace 14.4
town 13.9
statue 13.5
wheeled vehicle 13.4
river 13.3
tourist 13.3
stone 12.7
sculpture 12.4
cityscape 12.3
structure 12.2
house 12.2
monument 12.1
culture 12
historic 11.9
transportation 11.7
capital 11.4
stall 11.2
tower 10.7
bridge 10.4
ancient 10.4
historical 10.3
famous 10.2
wagon 10.2
sea 10.2
religion 9.9
port 9.6
walking 9.5
buildings 9.5
traditional 9.1
business 9.1
summer 9
vacation 9
center 8.9
scene 8.7
men 8.6
pedestrian 8.5
fountain 8.5
church 8.3
coast 8.1
black 7.8
marble 7.7
construction 7.7
crowd 7.7
power 7.6
ocean 7.5
transport 7.3
group 7.3
vehicle 7.2
holiday 7.2
night 7.1
flag 7.1
day 7.1

Google
created on 2019-06-17

Microsoft
created on 2019-06-17

outdoor 87.1
black and white 85.4
clothing 84
person 82.2
ship 81.5
old 44
posing 36.3
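
The Amazon tags above are label-detection output of the kind AWS Rekognition returns: a label name paired with a confidence score in percent. A minimal sketch of how comparable "label confidence" pairs could be generated with boto3 follows; the image file name, MaxLabels, and MinConfidence values are illustrative assumptions, not the settings used to produce this record.

import boto3

# Hypothetical local copy of the photograph; the file name is an assumption.
IMAGE_PATH = "annas_liberty_bell.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=100,       # assumed cap on returned labels
        MinConfidence=50.0,  # assumed confidence floor, in percent
    )

# Emit "Label confidence" pairs in the same style as the record above,
# e.g. "Person 99.7".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")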

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 54.7%
Disgusted 46.1%
Calm 47%
Sad 46.2%
Surprised 45.3%
Confused 45.3%
Happy 49.4%
Angry 45.7%

AWS Rekognition

Age 38-57
Gender Female, 51.7%
Calm 45.8%
Sad 45.5%
Disgusted 47.1%
Surprised 45.6%
Angry 46.2%
Confused 45.4%
Happy 49.4%

AWS Rekognition

Age 35-53
Gender Male, 52.9%
Angry 47.9%
Sad 45.5%
Disgusted 46.2%
Confused 45.8%
Happy 46.3%
Calm 46.8%
Surprised 46.4%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Disgusted 48.8%
Confused 45.5%
Sad 45.6%
Surprised 45.6%
Happy 45.7%
Calm 46.9%
Angry 47%

AWS Rekognition

Age 26-43
Gender Female, 52.4%
Sad 46.6%
Angry 47.4%
Disgusted 46.7%
Surprised 45.8%
Confused 45.4%
Happy 46.1%
Calm 47.1%

AWS Rekognition

Age 26-43
Gender Female, 54.2%
Calm 49.4%
Angry 45.4%
Happy 45.7%
Sad 48%
Confused 45.5%
Surprised 45.6%
Disgusted 45.3%

AWS Rekognition

Age 35-55
Gender Female, 50.5%
Sad 49.6%
Happy 49.8%
Angry 49.6%
Confused 49.6%
Calm 49.8%
Surprised 49.6%
Disgusted 49.6%
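
Each block above follows the shape of a single FaceDetails entry from AWS Rekognition's face-detection API: an estimated age range, a gender estimate with confidence, and a confidence score per emotion. A minimal sketch of how such blocks could be produced, assuming the same hypothetical local image file as before:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph; the file name is an assumption.
with open("annas_liberty_bell.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion scores
    )

# Print one block per detected face in the style of the record above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()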

Feature analysis

Amazon

Person 99.7%
Helicopter 71.4%

Text analysis

Amazon

2EELA
uddliacllg
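
The two strings above are raw text-detection output: they are whatever characters the model could read in the photograph and are reproduced as-is. A minimal sketch of how such strings could be extracted with AWS Rekognition's detect_text call, again assuming a hypothetical local image file:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph; the file name is an assumption.
with open("annas_liberty_bell.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Print each detected line of text, e.g. "2EELA".
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])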