Human Generated Data

Title

Untitled (old woman, old man and boy with umbrella with mule-drawn cart, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.552.5

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 97.4%
Person 97.4%
Clothing 96.9%
Apparel 96.9%
Person 96.3%
Person 89%
Spoke 82.6%
Machine 82.6%
Vehicle 77.7%
Transportation 77.7%
Coat 75.6%
Canopy 63.2%
Tire 63.2%
Sleeve 62%
Overcoat 60.4%
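
The Amazon tags above pair each label with a confidence score, which matches the output of AWS Rekognition's detect_labels operation. A minimal sketch using boto3, assuming configured AWS credentials; the region and file name are placeholders:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    # Placeholder path to a local copy of the photograph.
    with open("nazare_1967.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=60 mirrors the lowest score in the list above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=60,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")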

Clarifai
created on 2019-08-09

people 99.9%
adult 98.8%
two 98.6%
man 96.9%
child 96.5%
group 96.1%
woman 95.4%
vehicle 93.1%
three 92.8%
sit 88.8%
four 86.2%
group together 86%
transportation system 85.8%
furniture 85.1%
one 83.7%
carriage 81.2%
family 79.1%
wear 78.6%
military 78.4%
lid 78.4%
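
The Clarifai tags are consistent with concept scores from Clarifai's general visual-recognition model on its v2 REST API. A sketch using requests; the API key, model ID, and image URL are placeholders, and scores arrive in [0, 1], so they are scaled to percentages here:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"                  # placeholder
    MODEL_ID = "general-image-recognition"             # assumed alias for the general model
    IMAGE_URL = "https://example.com/nazare_1967.jpg"  # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}%")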

Imagga
created on 2019-08-09

architecture 30.3%
cannon 27.2%
building 26.4%
sky 24.2%
travel 21.8%
vehicle 21.8%
old 19.5%
iron lung 18.3%
city 18.3%
roof 18.1%
device 18%
history 17%
machine 16.9%
gun 15.8%
dome 14.9%
respirator 14.7%
tank 13.6%
structure 13.5%
water 13.3%
construction 12.8%
military vehicle 12.1%
clouds 11.8%
tracked vehicle 11.7%
tourism 11.5%
tractor 11.4%
landscape 11.2%
breathing device 11%
weapon 10.9%
vacation 10.6%
house 10.2%
high-angle gun 10.1%
vintage 9.9%
hut 9.6%
artillery 9.5%
wheel 9.4%
monument 9.3%
wheeled vehicle 9.3%
church 9.2%
outdoor 9.2%
landmark 9%
tower 8.9%
farm 8.9%
snow 8.8%
work 8.6%
culture 8.5%
historical 8.5%
equipment 8.4%
fountain 8.4%
industrial 8.2%
stone 8.2%
rural 7.9%
conveyance 7.9%
center 7.9%
armored vehicle 7.8%
statue 7.8%
ancient 7.8%
war 7.7%
tree 7.7%
winter 7.7%
sculpture 7.6%
field 7.5%
famous 7.4%
tourist 7.4%
town 7.4%
bulldozer 7.4%
historic 7.3%
shelter 7.3%
religion 7.2%
holiday 7.2%
scenic 7%
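
The Imagga list matches the tag/confidence pairs returned by Imagga's /v2/tags endpoint, which authenticates with HTTP Basic credentials. A sketch with placeholder key, secret, and image URL:

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"                    # placeholder
    API_SECRET = "YOUR_IMAGGA_API_SECRET"              # placeholder
    IMAGE_URL = "https://example.com/nazare_1967.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP Basic auth
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}%")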

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

text 98.5%
wheel 95.7%
land vehicle 91.1%
person 90.1%
vehicle 80.4%
tire 72.5%
black and white 64.2%
old 40.9%
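
The Microsoft tags are consistent with the Azure Computer Vision analyze endpoint; v2.0 was current at the 2019 creation date shown above. A sketch with a placeholder region, subscription key, and image URL; confidences arrive in [0, 1]:

    import requests

    ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
    SUBSCRIPTION_KEY = "YOUR_AZURE_KEY"                      # placeholder
    IMAGE_URL = "https://example.com/nazare_1967.jpg"        # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")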

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Male, 98.5%
Disgusted 0.1%
Angry 0.1%
Sad 98.4%
Happy 0%
Confused 0.1%
Fear 0.7%
Surprised 0%
Calm 0.5%

AWS Rekognition

Age 37-55
Gender Male, 67.8%
Fear 4%
Sad 48.8%
Angry 2.6%
Happy 4.5%
Calm 35.6%
Surprised 2.5%
Disgusted 0.5%
Confused 1.5%

AWS Rekognition

Age 31-47
Gender Female, 54.9%
Confused 1.4%
Sad 61%
Angry 21.7%
Happy 6.3%
Calm 4.7%
Surprised 1%
Disgusted 1.6%
Fear 2.4%
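
The three per-face blocks above (age range, gender, and emotion percentages) match what AWS Rekognition's detect_faces operation returns when all facial attributes are requested. A sketch assuming boto3 and configured credentials; the file name is a placeholder:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    with open("nazare_1967.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, emotions, and more.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # e.g. Type "SAD", Confidence 98.4
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")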

Feature analysis

Amazon

Person 97.4%

Categories

Imagga

paintings art 87.7%
nature landscape 11.6%
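
The two category scores above plausibly come from one of Imagga's /v2/categories categorizers; "personal_photos" is assumed here, since its category set includes names like "paintings art" and "nature landscape". A sketch with placeholder credentials and image URL:

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"                    # placeholder
    API_SECRET = "YOUR_IMAGGA_API_SECRET"              # placeholder
    IMAGE_URL = "https://example.com/nazare_1967.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    for category in response.json()["result"]["categories"]:
        print(f"{category['name']['en']} {category['confidence']:.1f}%")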