Human Generated Data

Title

Untitled (man with Pearl beer cases on horse cart in front of "Beer Tree")

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2648

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Machine 94.2
Wheel 94.2
Horse 91.8
Animal 91.8
Mammal 91.8
Transportation 90.9
Vehicle 90.9
Wheel 89.7
Wheel 86.1
Bicycle 82.5
Bike 82.5
Human 79.9
Person 79.9
Person 70
Wagon 67.4
Horse Cart 67.4
Furniture 65.6
Spoke 64.6
Road 64.5
Couch 62.8
People 58.2
Asphalt 56.4
Tarmac 56.4
Person 56.1
Town 55.7
Street 55.7
Building 55.7
City 55.7
Urban 55.7
Carriage 55.2
Buggy 55.1
Wheel 52.2

Imagga
created on 2022-01-15

structure 32.8
billboard 30.9
sky 25.6
signboard 25
landscape 20.8
water 20
travel 19
shopping cart 18.6
building 16.8
wheeled vehicle 15.9
handcart 15.4
old 15.3
clouds 15.2
architecture 15
city 14.1
light 14
sunset 13.5
sea 13.3
black 13.2
tourism 13.2
night 12.4
cityscape 12.3
urban 12.2
cloud 12.1
dark 11.7
grunge 11.1
protection 10.9
horizon 10.8
tower 10.7
skyline 10.5
scene 10.4
evening 10.3
ocean 10.1
landmark 9.9
scenery 9.9
vacation 9.8
river 9.8
scenic 9.7
summer 9.6
dusk 9.5
tree 9.5
beach 9.5
bay 9.4
truck 9.4
jigsaw puzzle 9.4
sunrise 9.4
holiday 9.3
lake 9.2
environment 9.1
stone 8.8
rural 8.8
destruction 8.8
steam 8.7
bridge 8.6
buildings 8.5
castle 8.5
smoke 8.4
texture 8.3
silhouette 8.3
vintage 8.3
island 8.2
dirty 8.1
coast 8.1
sand 8.1
history 8.1
nuclear 7.8
fog 7.7
gas 7.7
downtown 7.7
industry 7.7
power 7.6
famous 7.4
boat 7.4
puzzle 7.4
symbol 7.4
man 7.4
container 7.4
fence 7.4
wall 7.3
motor vehicle 7.3
industrial 7.3
trees 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.7
black and white 95
outdoor 90.9
monochrome 84.6
black 66.9
old 64.5
street 64.2
tree 54.8
drawn 41.4

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 93.1%
Confused 2.9%
Sad 2.8%
Disgusted 0.5%
Happy 0.3%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Male, 98.9%
Calm 53.4%
Sad 42.3%
Angry 1.2%
Confused 0.9%
Disgusted 0.7%
Fear 0.6%
Happy 0.5%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 94.2%
Horse 91.8%
Bicycle 82.5%
Person 79.9%

Captions

Microsoft

a vintage photo of a horse drawn carriage in front of a building 58.9%
a vintage photo of a horse drawn carriage 58.8%
a vintage photo of a horse 58.7%

Text analysis

Amazon

Pearl
BILL
SCHROEDER
BILL SCHROEDER Prop.
Prop.
DISTRIBUTOR
Southern, Pearl
Southern,
GAS

Google

YT37A2-XAGON