Human Generated Data

Title

Untitled (people on boat next to shore)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16749

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Military 94
Person 92.6
Human 92.6
Military Uniform 90.9
Person 90.1
Boat 85.4
Transportation 85.4
Vehicle 85.4
Soldier 81.5
Army 81.5
Armored 81.5
Person 72.1
Spoke 71.1
Machine 71.1
People 68.8
Tire 60.9
Officer 59.4
Meal 57.2
Food 57.2
Tarmac 55.1
Asphalt 55.1
Wheel 50.4

Imagga
created on 2022-02-26

sky 34.7
road 26.2
landscape 26
wheeled vehicle 25.3
shopping cart 24.5
cloud 23.3
rural 22.9
handcart 19.9
travel 19.7
fence 17.8
transportation 17
clouds 16.9
field 16.7
grass 16.6
highway 16.4
tree 16.3
trailer 16.2
snow 14.9
horizon 14.4
car 14.1
park bench 14.1
bench 14.1
structure 13.8
vehicle 13.5
drive 13.2
country 13.2
scene 13
light 12.7
railing 12.4
conveyance 12.4
scenic 12.3
weather 12.3
countryside 11.9
transport 11.9
mountain 11.7
trees 11.6
environment 11.5
water 11.3
container 11.3
street 11
speed 11
summer 10.9
truck 10.9
farm 10.7
speedway 10.4
cloudy 10.3
winter 10.2
old 9.8
forest 9.6
traffic 9.5
color 9.5
trip 9.4
hill 9.4
outdoor 9.2
outdoors 9
freeway 8.9
driving 8.7
day 8.6
seat 8.6
industry 8.5
building 8.5
fast 8.4
mobile home 8.4
barrier 8.4
house 8.4
vacation 8.2
sunset 8.1
sea 7.8
route 7.8
line 7.8
asphalt 7.8
season 7.8
housing 7.8
cold 7.8
racetrack 7.6
land 7.4
course 7.4
yellow 7.3
scenery 7.2
beach 7.1
picket fence 7.1
to 7.1
plant 7.1

Microsoft
created on 2022-02-26

outdoor 98.7
text 93
black and white 84.3
tree 77.2
white 72.4
sky 56.2
old 44.1

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 99.4%
Sad 75.4%
Calm 12.5%
Happy 7.2%
Surprised 1.4%
Fear 1.2%
Disgusted 0.8%
Confused 0.8%
Angry 0.7%

Feature analysis

Amazon

Person 92.6%
Boat 85.4%
Wheel 50.4%

Captions

Microsoft

a vintage photo of a truck 88.7%
a vintage photo of a plane 77.3%
a vintage photo of a person riding on the back of a truck 63.6%

Text analysis

Amazon

34
ALVITA
=
KODAK-

Google

Cai Cieft
Cai
Cieft