Human Generated Data

Title

Untitled (men in courtyard in front of fishing shed, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.529.4

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99.6
Person 99.6
Person 99.4
Person 97.7
Person 96.6
Outdoors 95.2
Person 94.7
Nature 94.1
Person 88.2
Building 84.3
Clothing 81.8
Apparel 81.8
Transportation 72.9
Vehicle 72.9
Boat 72.9
People 70.2
Field 69.8
Countryside 69.4
Person 68.7
Shorts 63.7
Road 62.6
Water 61.5
Soil 60.9
Rural 59.2
Asphalt 57.7
Tarmac 57.7
Crowd 57.2
Shoreline 55.8

Clarifai
created on 2019-08-09

people 99.9
group together 98.8
adult 98.4
one 97.5
two 97.3
group 94.9
man 93.9
four 91.3
vehicle 90.8
military 90.5
child 90.1
three 89.7
wear 89.1
war 89
woman 88.2
several 87.1
five 85.8
recreation 84.3
many 83.8
monochrome 83.5

Imagga
created on 2019-08-09

cricket bat 23.8
cricket equipment 22.4
grass 21.3
sports equipment 18
outdoor 16.8
sky 15.9
rural 15.9
equipment 15.6
man 15.5
field 15.1
farm 14.3
dog 14.1
football helmet 13.1
people 12.8
person 12.6
outdoors 11.9
happiness 11.7
summer 11.6
horse 11.5
structure 11.3
helmet 11.2
landscape 11.2
adult 11.1
tree 10.8
sport 10.8
park 9.9
happy 9.4
canine 9.1
environment 9
active 9
fun 9
meadow 9
trees 8.9
travel 8.4
dark 8.3
action 8.3
animals 8.3
leisure 8.3
freedom 8.2
countryside 8.2
danger 8.2
water 8
truck 8
headdress 8
lifestyle 7.9
autumn 7.9
forest 7.8
fence 7.8
male 7.8
vehicle 7.7
outside 7.7
old 7.7
barn 7.6
clothing 7.5
speed 7.3
sun 7.2
road 7.2
wheeled vehicle 7.2
mountain 7.1
day 7.1

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

outdoor 96.9
black and white 89.9
text 81.9
old 75.8
person 65.8
vintage 49.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-44
Gender Male, 52.3%
Sad 47.3%
Confused 50.4%
Surprised 45.2%
Disgusted 45.6%
Angry 45.1%
Happy 45.4%
Fear 45.9%
Calm 45.2%

AWS Rekognition

Age 13-23
Gender Female, 50.3%
Surprised 49.6%
Confused 49.5%
Disgusted 49.5%
Calm 49.9%
Happy 49.6%
Fear 49.6%
Angry 49.5%
Sad 49.7%

Feature analysis

Amazon

Person 99.6%
Boat 72.9%

Categories

Text analysis

Amazon

DOS

Google

AR
PES
AR DOS PES DEUS
DOS
DEUS