Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.73

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.4
Human 98.4
Boat 93.8
Transportation 93.8
Vehicle 93.8
Wood 87.4
Clothing 71.7
Apparel 71.7
Machine 69.1
Spoke 59.3
Carpenter 55.8

Clarifai
created on 2023-10-25

people 99.9
adult 98.9
vehicle 98.7
one 98.3
two 98.2
watercraft 98.2
group together 97.1
group 96.9
transportation system 96.4
man 93.9
three 92.8
raw material 90.7
biplane 88.4
four 87.2
many 87.2
aircraft 84.6
wear 84
military 83.3
construction worker 81.8
war 78.8

Imagga
created on 2022-01-09

loom 55.5
machine 48.2
device 47.8
textile machine 46.1
industry 37.6
sky 29.3
industrial 29
crane 26.5
steel 25.2
construction 24.8
catapult 24.5
sea 24.2
metal 22.5
water 22
port 21.2
instrument 20.4
cargo 19.4
structure 19.3
engine 19
equipment 18.6
bridge 17.8
dock 17.5
architecture 17.2
transportation 17
power 16.8
harbor 16.4
building 15.9
business 15.8
crosspiece 15.6
lift 14.6
transport 14.6
ocean 14.1
container 14
shipping 13.7
brace 13.5
iron 13.1
freight 12.7
tower 12.5
city 12.5
logistics 11.8
skeleton 11.7
river 11.6
trade 11.5
heavy 11.4
engineering 11.4
work 11.2
old 11.1
ship 10.9
wharf 10.8
travel 10.6
high 10.4
commerce 10.3
energy 10.1
cranes 9.9
carrier 9.8
export 9.8
cable 9.7
commercial 9.4
strengthener 9.4
boat 9.3
loading 8.9
load 8.8
gas 8.7
guillotine 8.5
instrument of execution 8.3
historic 8.2
vessel 8.1
quay 7.9
weight 7.8
international 7.6
skyline 7.6
house 7.5
wood 7.5
silhouette 7.5
landscape 7.4
oil 7.4
landmark 7.2
coast 7.2
structural member 7.1
summer 7.1
country 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

ship 99.2
outdoor 93.4
text 90.1
watercraft 84.9
old 84.7
boat 83
transport 69.9
black 67.4
crane 53.4
vintage 35.4

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 99.9%
Calm 99.5%
Sad 0.3%
Happy 0.1%
Disgusted 0%
Confused 0%
Angry 0%
Surprised 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%
Boat 93.8%