Human Generated Data

Title

Untitled (Papeete, Tahiti)

Date

January 14, 1960 – January 22, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5009

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.8
Human 99.8
Person 99.8
Person 99.5
Person 99.1
Wheel 97.5
Machine 97.5
Person 96.3
Wheel 94.9
Transportation 94.8
Vehicle 94.8
Bicycle 94.8
Bike 94.8
Wheel 92.6
Person 89.6
Bus 85.8
Person 85.3
Person 85.3
Cable Car 77.1
Wheel 76.4
Person 73.6
Person 68.3
Bicycle 63.9
Person 63.9
Person 62.8
Person 62.7
Tram 55.3
Trolley 55.3
Streetcar 55.3
Person 44.3
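
The label rows above pair each detected concept with a confidence score from Amazon Rekognition. Below is a minimal sketch of the kind of call that produces such tags, using boto3; the local filename and the MinConfidence threshold are assumptions, not values recorded here.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are already configured

    # Hypothetical local copy of the photograph; Rekognition also accepts S3 objects.
    with open("papeete_tahiti.jpg", "rb") as image_file:
        response = rekognition.detect_labels(
            Image={"Bytes": image_file.read()},
            MinConfidence=40,  # assumed threshold; the list above bottoms out near 44%
        )

    # Each label carries a name and a confidence score, matching the rows above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")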

Clarifai
created on 2018-03-23

people 100
group together 99.8
group 99.7
many 99.7
vehicle 99.1
adult 98.8
transportation system 96.3
several 96
man 95.9
war 95.7
administration 95.3
military 95
soldier 93.9
woman 90.7
leader 89.7
wear 85.6
crowd 79
child 77.4
five 76.4
two 76.4
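
The Clarifai concepts above come from its general image-recognition model. A hedged sketch follows, assuming the legacy clarifai 2.x Python client that was current around the 2018 run date; the API key and filename are placeholders, and newer Clarifai SDKs use a different interface.

    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_API_KEY")        # placeholder credential
    model = app.public_models.general_model          # Clarifai's general concept model

    response = model.predict_by_filename("papeete_tahiti.jpg")  # hypothetical local filename

    # Concepts come back with probabilities in [0, 1]; the page above shows them as percentages.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")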

Imagga
created on 2018-03-23

vehicle 37.3
streetcar 37
wheeled vehicle 35.7
conveyance 35
bus 34
seller 32.2
minibus 24.6
transportation 24.2
passenger 22.3
car 21.5
street 21.2
public transport 19.4
transport 19.2
travel 17.6
tramway 17.4
architecture 17.2
road 17.2
city 16.6
urban 15.7
old 15.3
building 15.1
stall 13.3
wagon 13.3
historic 11.9
tourism 11.5
train 10.7
buildings 10.4
construction 10.3
house 10
danger 10
power 9.2
tourist 8.5
sky 8.3
vacation 8.2
history 8
station 7.8
container 7.8
destruction 7.8
track 7.8
public 7.8
industry 7.7
auto 7.7
wheel 7.5
town 7.4
industrial 7.3
people 7.2
structure 7.2
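
The Imagga tags above are the output of its automatic tagging service. The sketch below targets Imagga's v2 REST tagging endpoint with HTTP basic authentication; the credentials and image URL are placeholders, and the 2018 run may have used an earlier API version.

    import requests

    API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"        # placeholder credentials
    image_url = "https://example.org/papeete_tahiti.jpg"   # hypothetical image location

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence, as in the list above.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")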

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

outdoor 99.9
building 99.5
people 88.1
group 60.2
old 56.1
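
The Microsoft tags above are the sort returned by the Azure Computer Vision "Analyze Image" operation. A hedged sketch against the current REST endpoint follows; the region, API version, and subscription key are assumptions, and the 2018 run likely used an earlier version of the service.

    import requests

    endpoint = "https://westus.api.cognitive.microsoft.com/vision/v3.2/analyze"  # assumed region/version
    headers = {
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",   # placeholder credential
        "Content-Type": "application/octet-stream",
    }

    with open("papeete_tahiti.jpg", "rb") as image_file:   # hypothetical local filename
        resp = requests.post(
            endpoint,
            headers=headers,
            params={"visualFeatures": "Tags"},
            data=image_file.read(),
        )
    resp.raise_for_status()

    # Tag confidences come back in [0, 1]; the page above shows them as percentages.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")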

Color Analysis

Face analysis

AWS Rekognition

Age 26-44
Gender Male, 53.8%
Angry 45.4%
Confused 45.5%
Disgusted 45.5%
Surprised 45.3%
Sad 46.5%
Calm 51.3%
Happy 45.5%

AWS Rekognition

Age 35-52
Gender Female, 53.4%
Happy 45.4%
Confused 45.5%
Sad 47.5%
Angry 45.6%
Calm 50.1%
Disgusted 45.4%
Surprised 45.5%

AWS Rekognition

Age 26-43
Gender Female, 52.7%
Surprised 46%
Happy 45.6%
Disgusted 45.7%
Calm 49.4%
Confused 45.7%
Angry 45.9%
Sad 46.7%

AWS Rekognition

Age 29-45
Gender Male, 50.4%
Happy 49.5%
Confused 49.5%
Sad 49.5%
Angry 50%
Calm 49.5%
Disgusted 49.8%
Surprised 49.5%

AWS Rekognition

Age 15-25
Gender Male, 50%
Calm 49.8%
Disgusted 49.5%
Angry 49.6%
Happy 49.6%
Confused 49.6%
Sad 49.7%
Surprised 49.7%

AWS Rekognition

Age 15-25
Gender Female, 50.5%
Disgusted 50.2%
Happy 49.5%
Surprised 49.5%
Sad 49.5%
Calm 49.6%
Angry 49.5%
Confused 49.5%

Microsoft Cognitive Services

Age 22
Gender Male
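
The age ranges, gender estimates, and per-emotion confidences listed above map directly onto the FaceDetails structure returned by Rekognition's face analysis. A minimal sketch, assuming a hypothetical local copy of the photograph:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("papeete_tahiti.jpg", "rb") as image_file:
        response = rekognition.detect_faces(
            Image={"Bytes": image_file.read()},
            Attributes=["ALL"],   # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")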

Feature analysis

Amazon

Person 99.8%
Wheel 97.5%
Bicycle 94.8%
Bus 85.8%

Categories

Text analysis

Amazon

TOMATO
HOUISTER
7E
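
The raw strings above ("TOMATO", "HOUISTER", "7E") are what Rekognition reads off signage in the photograph. A minimal sketch of the text-detection call, again with a hypothetical local filename:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("papeete_tahiti.jpg", "rb") as image_file:
        response = rekognition.detect_text(Image={"Bytes": image_file.read()})

    # Rekognition returns both full lines and individual words; print the lines.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])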