Human Generated Data

Title

Untitled (boat show display)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16818

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.5
Person 99.5
Person 99.4
Person 99
Person 98.4
Person 91.5
Person 91.4
Boat 89.3
Vehicle 89.3
Transportation 89.3
Aircraft 74.3
Airplane 74.3
Wheel 68.6
Machine 68.6
Watercraft 67.4
Vessel 67.4
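
The Amazon tags above are label/confidence pairs consistent with output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how such pairs can be produced with boto3 follows; the file name photo.jpg, the region, and the MaxLabels/MinConfidence settings are illustrative assumptions, not part of the original record.

import boto3

# Assumes AWS credentials are already configured in the environment.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=60,    # drop low-confidence guesses
)

# Print "label confidence" pairs in the same shape as the record above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")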

Imagga
created on 2022-02-26

ship 44
sea 35.2
boat 35
vessel 29.4
port 28.9
water 28
harbor 26.9
transport 25.6
device 24.9
transportation 24.2
ocean 23.3
travel 21.1
warship 19.7
yacht 19.2
marina 18.5
industrial 18.1
dock 17.5
industry 17.1
sky 16.6
cruise 16.5
nautical 16.5
tourism 16.5
luxury 16.3
pier 15.6
vacation 15.5
airplane 14.9
vehicle 14.6
sailing 14.6
torpedo 14.1
craft 13.8
boats 13.6
military vehicle 13.4
aircraft 13.3
ships 12.8
bay 12.4
building 12.1
power 11.7
cargo 11.6
aircraft carrier 11.4
beach 11
moored 10.8
yachts 10.8
sail 10.7
plane 10.6
military 10.6
marine 10.4
technology 10.4
business 10.3
summer 10.3
battleship 10.2
air 10.1
coast 9.9
docked 9.9
war 9.9
river 9.8
old 9.7
airliner 9.6
jet 9.4
holiday 9.3
leisure 9.1
machine 9.1
tourist 9.1
iron lung 9.1
sailboat 9
passenger 9
airport 8.9
explosive device 8.9
liner 8.8
engine 8.7
city 8.3
space shuttle 8.1
steel 7.9
architecture 7.9
navy 7.9
aviation 7.8
freight 7.8
flight 7.7
platform 7.6
equipment 7.6
journey 7.5
outdoors 7.5
vintage 7.4
oil 7.4
speed 7.3
reflection 7.3
respirator 7.2
tank 7.2
sunset 7.2
wealth 7.2
musical instrument 7
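
The Imagga tags use the same label/score shape and are consistent with Imagga's v2 tagging endpoint. A rough sketch using the requests library; the API credentials and the image URL are placeholders.

import requests

IMAGGA_KEY = "<api-key>"        # placeholder credentials
IMAGGA_SECRET = "<api-secret>"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical image URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence score.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")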

Microsoft
created on 2022-02-26

text 99.8
black and white 94.2
piano 70.2
black 65.4
white 61.1
vehicle 60
monochrome 53.6

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 79.9%
Sad 97.5%
Confused 0.8%
Calm 0.7%
Fear 0.4%
Angry 0.4%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 96.9%
Calm 68.1%
Sad 26.6%
Happy 2%
Confused 1.2%
Disgusted 1.1%
Surprised 0.4%
Fear 0.4%
Angry 0.2%
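
The two AWS Rekognition face blocks above (age range, gender, emotion percentages) match the structure Rekognition's DetectFaces operation returns when all facial attributes are requested. A hedged sketch, reusing the hypothetical rekognition client and image_bytes from the DetectLabels example earlier.

# Assumes `rekognition` and `image_bytes` from the DetectLabels sketch above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion scores are independent confidences and need not sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")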

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
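
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages, which matches the face_detection method of the google-cloud-vision client. A minimal sketch, again assuming a hypothetical local photo.jpg and configured credentials.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Index positions follow the Likelihood enum returned by the API.
likelihoods = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihoods[face.surprise_likelihood])
    print("Anger", likelihoods[face.anger_likelihood])
    print("Sorrow", likelihoods[face.sorrow_likelihood])
    print("Joy", likelihoods[face.joy_likelihood])
    print("Headwear", likelihoods[face.headwear_likelihood])
    print("Blurred", likelihoods[face.blurred_likelihood])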

Feature analysis

Amazon

Person 99.5%
Boat 89.3%
Airplane 74.3%

Captions

Microsoft

a group of people on a boat 53%
a group of people standing in front of a boat 52.9%
a group of people in a boat 52.8%
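
The three ranked captions above are the kind of output the Azure Computer Vision Describe Image operation returns: several candidate captions, each with a confidence. A rough sketch against the v3.2 REST endpoint; the endpoint host and subscription key are placeholders.

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder Azure resource
subscription_key = "<subscription-key>"                           # placeholder key

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # ask for several ranked captions, as in the record above
    headers={
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence']:.1%}")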

Text analysis

Amazon

Chris
Chris Craft
Craft
SHEPARD
SHEPARD CRUISER
CRUISER
JOHNSON
TELEPHONE
WAYZA
09
TELEPHONE MAYZATA 09
Christ
WAYZA O
MAYZATA
Traft
SON
will
O
.....
OHNSON
KODAK-SEEL
محمد
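
The Amazon strings above, including partial and misread words such as WAYZA and OHNSON, look like raw output from Rekognition's DetectText operation, which returns both full lines and their constituent words. A sketch reusing the hypothetical client and image bytes from the earlier examples.

# Assumes `rekognition` and `image_bytes` from the DetectLabels sketch above.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Each detection is either a LINE or one of its constituent WORDs.
    print(f"{detection['DetectedText']}  ({detection['Type']}, {detection['Confidence']:.1f}%)")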

Google

SHOW
ROO
WORNS
WA
A
ELEHONE
WETOATR
ONI
BAT
SHEPARD
ELEHONE WETOATR CR EPNS SHOW ROO ONI BAT WORNS INC WA Chris Craft SHEPARD CRUISER NSON HNSON YT33A2- A
CR
EPNS
INC
Chris
CRUISER
Craft
NSON
HNSON
YT33A2-
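
The Google list includes one long run-together string alongside the individual tokens, which is consistent with google-cloud-vision text_detection: the first annotation is the full detected text block and the remaining annotations are individual tokens. A minimal sketch, assuming the client and image from the face-detection example above.

# Assumes `client` and `image` from the Google Vision face-detection sketch above.
response = client.text_detection(image=image)

annotations = response.text_annotations
if annotations:
    print("Full block:", annotations[0].description)
    for token in annotations[1:]:
        print(token.description)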