Human Generated Data

Title

Untitled (Papeete, Tahiti)

Date

January 14, 1960 – January 22, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5008

Machine Generated Data

Tags

Amazon
created on 2023-10-05

City 98.8
Road 98.8
Street 98.8
Urban 98.8
Person 98.6
Person 98.5
Adult 98.5
Male 98.5
Man 98.5
Person 98.4
Child 98.4
Female 98.4
Girl 98.4
Person 98.3
Person 98
Person 98
Person 96.9
Person 95.6
License Plate 95.4
Transportation 95.4
Vehicle 95.4
Machine 95.1
Wheel 95.1
Person 94.7
Wheel 93.2
Wheel 91.7
Wheel 91.4
Neighborhood 90.9
Bicycle 90.9
Person 86.1
Car 86
Wheel 81.7
Bicycle 80.9
Motorcycle 75.3
Person 74.8
Person 73.4
Wheel 72.3
Head 71.1
Indoors 70
Restaurant 70
Wheel 69.5
Person 67.3
Wheel 66.4
Wheel 63.1
Face 62.7
Person 60.4
Clothing 58.2
Footwear 58.2
Shoe 58.2
Market 57.7
Bazaar 57.5
Shop 57.5
Shoe 55.8
Antique Car 55.6
Architecture 55.4
Building 55.4
Outdoors 55.4
Shelter 55.4
Factory 55.2
Spoke 55
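
These labels and their confidence scores (0–100) are the shape of output returned by AWS Rekognition's DetectLabels API. A minimal boto3 sketch of such a call; the filename, region, and parameter values are illustrative assumptions, not details taken from this record:

import boto3

# Rekognition client; the region is an illustrative choice
client = boto3.client("rekognition", region_name="us-east-1")

with open("papeete_tahiti.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # illustrative cap on returned labels
    MinConfidence=55.0,  # roughly matches the lowest scores listed above
)

# Each label carries a name and a 0-100 confidence, as in the list above
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')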

Clarifai
created on 2018-05-10

people 99.9
group together 99.4
vehicle 99.1
transportation system 98.8
group 98.5
adult 98.3
street 97.9
monochrome 95.5
man 94.5
many 94.4
car 88.4
road 88.4
several 86
crowd 86
war 82.8
military 81
horizontal plane 80.5
driver 78.4
stock 77
military vehicle 74.8
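
Clarifai reports concepts with confidence values on a 0–1 scale, shown here as percentages. A hedged sketch using the clarifai-grpc client against Clarifai's public general model; the API key, model ID, and image URL are placeholders, and auth and model-resolution details vary across client versions:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

channel = ClarifaiChannel.get_grpc_channel()
stub = service_pb2_grpc.V2Stub(channel)
metadata = (("authorization", "Key YOUR_CLARIFAI_API_KEY"),)  # placeholder key

request = service_pb2.PostModelOutputsRequest(
    model_id="general-image-recognition",  # public general model; ID may differ by version
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.org/papeete.jpg")  # placeholder URL
            )
        )
    ],
)
response = stub.PostModelOutputs(request, metadata=metadata)

# Concepts come back with 0-1 values; scale by 100 to match the list above
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")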

Imagga
created on 2023-10-05

vehicle 27.9
transportation 22.4
truck 21.1
car 20.5
street 18.4
travel 16.9
building 16.8
architecture 16.7
road 16.3
machine 15.4
sky 15.3
city 15
transport 14.6
construction 14.5
wheel 14.5
motor vehicle 14.3
urban 13.1
old 12.5
house 12.5
structure 12
industry 11.1
industrial 10.9
wheeled vehicle 10.6
center 10.2
town 10.2
danger 10
tire 9.9
destruction 9.8
war 9.6
people 9.5
buildings 9.4
day 9.4
tourism 9.1
equipment 8.7
sand 8.7
military 8.7
dirt 8.6
drive 8.5
stone 8.5
power 8.4
landscape 8.2
uniform 8.1
tree 8
rural 7.9
work 7.8
village 7.8
rock 7.8
army 7.8
scene 7.8
machinery 7.7
heavy 7.6
outdoors 7.6
site 7.5
weapon 7.5
stall 7.5
environment 7.4
vacation 7.4
historic 7.3
wreck 7.1
grass 7.1
working 7.1
wooden 7
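
Imagga serves tags over a REST endpoint (v2/tags) authenticated with an API key/secret pair via HTTP Basic auth. A minimal sketch with requests; the credentials and image URL are placeholders:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/papeete.jpg"},  # placeholder URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} entries
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')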

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.8
sky 98.7
bus 13.9
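
These tags were generated in 2018, so an earlier API version produced them; as a sketch, Microsoft's current Computer Vision v3.2 Analyze endpoint returns the same kind of tag/confidence pairs (confidence on a 0–1 scale). The endpoint, key, and image URL below are placeholders:

import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/papeete.jpg"},  # placeholder URL
)
resp.raise_for_status()

# Confidences are 0-1; scale to match the percentages above
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')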

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Female, 68.9%
Calm 83.5%
Sad 11.2%
Surprised 6.5%
Fear 6.1%
Confused 1.2%
Happy 1.2%
Disgusted 0.5%
Angry 0.4%

AWS Rekognition

Age 22-30
Gender Male, 99%
Calm 60.2%
Happy 28.6%
Surprised 7%
Fear 6.1%
Sad 3.3%
Disgusted 2.6%
Confused 1.8%
Angry 1.7%

AWS Rekognition

Age 4-10
Gender Female, 55.9%
Sad 99.7%
Fear 18.1%
Surprised 6.8%
Calm 4.7%
Confused 3.8%
Happy 2.5%
Disgusted 1.5%
Angry 0.8%

AWS Rekognition

Age 18-26
Gender Male, 94.2%
Calm 63%
Happy 10%
Sad 8.3%
Surprised 8.2%
Confused 6.4%
Fear 6.4%
Disgusted 3.2%
Angry 2.7%

AWS Rekognition

Age 7-17
Gender Female, 55.5%
Fear 86.5%
Surprised 12.9%
Calm 8.2%
Sad 6%
Happy 3%
Disgusted 1.5%
Angry 1.3%
Confused 0.3%

AWS Rekognition

Age 21-29
Gender Male, 92.1%
Sad 99.9%
Calm 14.7%
Surprised 6.5%
Fear 6.1%
Confused 1.8%
Disgusted 1.8%
Happy 1.5%
Angry 0.8%

AWS Rekognition

Age 16-24
Gender Male, 74.6%
Surprised 67.2%
Calm 39.7%
Fear 7.4%
Disgusted 5%
Sad 4.1%
Angry 3.1%
Confused 2.9%
Happy 1.2%
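
Each face block above follows the structure of an AWS Rekognition DetectFaces response with Attributes=['ALL']: an estimated age range, a gender call with its confidence, and emotion scores that are assessed independently and so need not sum to 100% (note the 'Sad 99.7%' face). A minimal boto3 sketch, with an assumed local filename:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("papeete_tahiti.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently, listed highest first
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')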

Feature analysis

Amazon

Person 98.6%
Adult 98.5%
Male 98.5%
Man 98.5%
Child 98.4%
Female 98.4%
Girl 98.4%
Wheel 95.1%
Bicycle 90.9%
Car 86%
Motorcycle 75.3%
Shoe 58.2%
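
This feature list repeats the labels that Rekognition localized to specific image regions; in a DetectLabels response these appear under each label's Instances array, each instance with its own bounding box and confidence. A self-contained sketch extracting them, reusing the same illustrative filename and threshold as above:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("papeete_tahiti.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55.0)

# Labels such as Person, Wheel, or Car can carry localized instances,
# each with a bounding box (ratios of image size) and its own confidence
for label in response["Labels"]:
    for instance in label["Instances"]:
        box = instance["BoundingBox"]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at left={box["Left"]:.2f}, top={box["Top"]:.2f}')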

Text analysis

Amazon

YEE
YEE YICK
YICK
RAISON
1073
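
Strings like these match the output of Rekognition's DetectText API, which returns detections both as whole LINEs and as their component WORDs; that is why 'YEE YICK' appears alongside 'YEE' and 'YICK'. A minimal sketch:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("papeete_tahiti.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections group words; WORD detections list them individually
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')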

Google

YEE YICK
YEE
YICK