Human Generated Data

Title

Untitled (group of soldiers beneath crane lifting army vehicle)

Date

1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15812

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 98.9
Person 98.9
Person 98.7
Person 96.9
Person 92.3
Person 91.1
Person 89.8
Person 84.6
Person 84.4
Transportation 83.3
Boat 83.3
Vehicle 83.3
Nature 80.5
Advertisement 76.5
Outdoors 75.9
Person 69.8
Art 69.5
Poster 68.8
Text 64.5
People 61.5
Weather 60.2
Collage 59.2
Wood 56.9
Silhouette 55.8
Person 47.3
Person 44.6

Imagga
created on 2022-02-05

sailboat 78.1
trimaran 68
sailing vessel 66.1
vessel 53.6
sky 33.3
boat 27.2
craft 26
water 25.4
bridge 23.5
sail 23.3
sea 22.8
catamaran 22.1
river 19.6
sailing 18.5
yacht 17.8
travel 17.6
ocean 17.4
landscape 17.1
sunset 17.1
schooner 16.8
ship 16.6
transport 16.4
architecture 16.4
transportation 16.1
light 16
summer 15.4
sun 15.3
structure 15.2
suspension 14.8
wave 14.7
cable 14.3
modern 14
cloud 13.8
wind 13.1
clouds 12.7
traffic 12.3
beach 12.1
construction 12
building 11.9
calm 11.9
flagpole 11.8
tourism 11.6
design 11.3
lake 11.1
energy 10.9
tower 10.7
cruise 10.7
sunrise 10.3
exterior 10.1
horizon 9.9
reflection 9.9
cables 9.8
wreckage 9.8
urban 9.6
high 9.5
industry 9.4
holiday 9.3
outdoor 9.2
speed 9.2
city 9.1
industrial 9.1
night 8.9
tropical 8.5
stay 8.4
evening 8.4
power 8.4
color 8.3
silhouette 8.3
island 8.2
environment 8.2
vacation 8.2
road 8.1
landmark 8.1
bay 8.1
coast 8.1
steel 8
mast 7.9
engineering 7.6
marine 7.6
staff 7.6
line 7.5
pattern 7.5
outdoors 7.5
smoke 7.4
part 7.3
futuristic 7.2
day 7.1
curve 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8
ship 99.1
watercraft 85.6
boat 79.9
black 71.9
black and white 69.3
old 58.4
image 32.9
picture frame 7

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 69.9%
Calm 84.5%
Sad 10.7%
Angry 2.5%
Happy 1%
Confused 0.4%
Fear 0.3%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 13-21
Gender Female, 59.7%
Calm 46.7%
Sad 21.1%
Disgusted 7.7%
Happy 6.6%
Angry 6.3%
Fear 4.1%
Surprised 3.9%
Confused 3.7%

AWS Rekognition

Age 19-27
Gender Female, 68.9%
Sad 92.4%
Calm 2.3%
Fear 1.3%
Confused 1.3%
Surprised 0.9%
Disgusted 0.7%
Happy 0.7%
Angry 0.3%

AWS Rekognition

Age 6-14
Gender Female, 88.1%
Sad 42%
Calm 30.1%
Angry 8.6%
Disgusted 6.6%
Fear 5.5%
Confused 3.5%
Happy 2.3%
Surprised 1.3%

AWS Rekognition

Age 18-24
Gender Female, 52.4%
Calm 64.9%
Happy 7.7%
Fear 7%
Sad 6.8%
Disgusted 6.7%
Angry 2.7%
Confused 2.5%
Surprised 1.7%

Feature analysis

Amazon

Person 98.9%
Boat 83.3%

Captions

Microsoft

an old photo of a boat 46.7%
old photo of a boat 44.3%
an old photo of a city 44.2%

Text analysis

Amazon

535JA

Google

535JA
535JA