Human Generated Data

Title

Untitled (two men in cart and woman walking down road, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.563.1

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Machine 99.6
Wheel 99.6
Person 98.4
Human 98.4
Person 98.4
Vehicle 90.6
Transportation 90.6
Person 79.8
Apparel 79.7
Clothing 79.7
Wagon 75.6
Spoke 73.9
Bicycle 73.1
Bike 73.1
Horse Cart 70.1
Wheel 62.2
Carriage 59.4
Female 56.1

Clarifai
created on 2019-08-09

people 99.7
adult 97.7
two 97.6
man 97.1
monochrome 96.2
vehicle 94.6
group together 93.6
woman 93.4
one 93.3
transportation system 92
group 89.1
three 84.5
child 80.8
wedding 77.6
wear 76.8
sepia 76.3
cavalry 73.3
military 71.1
veil 70
canine 69.3

Imagga
created on 2019-08-09

wheeled vehicle 70.4
tricycle 55.3
vehicle 53.7
barrow 39.7
handcart 31.9
conveyance 31.4
wheel 23.7
old 23.7
bicycle 23
bike 20.5
support 19.4
transportation 18.8
spoke 18
sunset 17.1
travel 15.5
silhouette 14.9
sport 14.8
transport 13.7
cycle 13.7
device 13.7
sky 13.4
people 13.4
city 13.3
outdoors 12.8
man 12.8
ride 12.7
carriage 11.8
summer 11.6
lifestyle 11.6
wheelchair 11.4
antique 11.2
street 11
mountain 10.7
retro 10.6
cart 10.5
horse 10.4
landscape 10.4
black 10.3
outdoor 9.9
vintage 9.9
recreation 9.9
vacation 9.8
wheels 9.8
sun 9.7
sea 9.4
cycling 8.9
riding 8.8
water 8.7
ancient 8.6
holiday 8.6
male 8.5
sunrise 8.4
health 8.3
ocean 8.3
cyclist 8.2
exercise 8.2
aged 8.1
active 8.1
metal 8
building 7.9
biking 7.9
biker 7.9
urban 7.9
adventure 7.6
speed 7.3
history 7.2
architecture 7
country 7

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

text 99.5
black and white 83.4
wheel 65.3

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50%
Fear 51.8%
Confused 45%
Calm 45.1%
Sad 47.3%
Disgusted 45.1%
Angry 45.4%
Surprised 45.1%
Happy 45.1%

Feature analysis

Amazon

Wheel 99.6%
Person 98.4%
Bicycle 73.1%

Categories

Imagga

paintings art 75.8%
pets animals 19.8%
nature landscape 2.8%

Text analysis

Amazon

J7