Human Generated Data

Title

Untitled (Phyllis Moore Stoll and Eugenie Stoll on deck of cruise ship)

Date

c. 1950

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21745

Machine Generated Data

Tags (each label is followed by its confidence score, in percent)

Amazon
created on 2022-03-11

Apparel 99.7
Clothing 99.7
Human 99.3
Person 99.3
Furniture 93.6
Robe 89.6
Fashion 89.6
Wedding 83.3
Person 82.7
Person 82.3
Gown 80.8
Bridegroom 76.5
Person 76.4
Wedding Gown 74.1
Table 73.7
Chair 73.4
Transportation 71.2
Vehicle 70.5
Dining Table 65.7
Female 63.4
Photography 61.8
Face 61.8
Photo 61.8
Portrait 61.8
Overcoat 61.1
Suit 61.1
Coat 61.1
Bride 58.6
Home Decor 56
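
These label/confidence pairs have the shape of AWS Rekognition DetectLabels output. A minimal sketch of how comparable tags could be generated with boto3 (the file name and confidence threshold are illustrative assumptions, not details from this record):

```python
import boto3

rekognition = boto3.client("rekognition")

# Illustrative file name only; the record does not say how the image was supplied.
with open("4.2002.21745.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed threshold; the lowest score listed above is 56
    )

# Each detection carries a label name and a confidence percentage.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```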

Imagga
created on 2022-03-11

deck 37.3
chair 37.1
building 30.5
seat 28.8
architecture 25.9
structure 23.3
transportation 22.4
passenger 22.1
urban 21.9
travel 21.8
city 21.6
interior 21.2
modern 21
business 18.8
water 16.7
airport 15.6
restaurant 15.4
window 15.3
steel 15.1
glass 14.8
hall 14.7
indoors 14.1
people 13.9
office 13.9
reflection 13.8
departure 13.8
transport 13.7
support 13.6
furniture 13.5
sea 13.3
gate 13.2
empty 12.9
inside 12.9
station 12.6
perspective 12.2
table 12.1
boat 12.1
metal 12.1
light 12
cafeteria 12
construction 12
device 11.9
patio 11.8
mall 11.7
public 11.7
boats 11.6
area 11.6
room 11
sky 10.2
ocean 10.1
silhouette 9.9
tourism 9.9
vacation 9.8
journey 9.4
floor 9.3
summer 9
tables 8.9
sun 8.9
walkway 8.8
chairs 8.8
built 8.7
scene 8.7
sunny 8.6
wall 8.6
place 8.4
house 8.4
design 7.9
stair 7.9
ceiling 7.9
corridor 7.9
day 7.8
stairs 7.8
life 7.8
industry 7.7
windows 7.7
wood 7.5
industrial 7.3
tourist 7.3
door 7.2
holiday 7.2
balcony 7.1
work 7.1
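
Imagga offers comparable tagging over a REST API. A hedged sketch against its documented v2 /tags endpoint (the credentials and image URL are placeholders):

```python
import requests
from requests.auth import HTTPBasicAuth

API_KEY = "your_api_key"        # placeholder credentials
API_SECRET = "your_api_secret"  # placeholder credentials

# Illustrative hosted copy of the image; not a real URL from this record.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.21745.jpg"},
    auth=HTTPBasicAuth(API_KEY, API_SECRET),
)
response.raise_for_status()

# Tags arrive as {"confidence": 37.3, "tag": {"en": "deck"}} objects.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```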

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 98.3
furniture 97.4
table 93.3
black and white 92.1
chair 89.3
man 81.7
clothing 80.1
person 75.4
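
The Microsoft tags are consistent with Azure Computer Vision's tagging operation, which reports confidences on a 0-1 scale (the listing above appears rescaled to percent). A sketch with the Python SDK (endpoint, key, and URL are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
)

# Illustrative hosted copy of the image.
result = client.tag_image("https://example.org/4.2002.21745.jpg")

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")  # SDK confidences are 0-1
```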

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 95.8%
Happy 92.3%
Calm 3.6%
Sad 2.1%
Disgusted 0.7%
Surprised 0.6%
Confused 0.3%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 23-31
Gender Female, 95.1%
Calm 28.4%
Happy 27.4%
Sad 23.5%
Fear 13.3%
Confused 2%
Disgusted 1.9%
Surprised 1.8%
Angry 1.7%
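
The two age/gender/emotion blocks match the FaceDetails structure that Rekognition's DetectFaces returns when all facial attributes are requested. A minimal sketch (the image source is an assumption):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.21745.jpg", "rb") as f:  # illustrative file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are not guaranteed to be sorted; order them high-to-low as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```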

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Likely
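
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is what the three blocks above show. A sketch with the google-cloud-vision client (the file name is illustrative):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.21745.jpg", "rb") as f:  # illustrative file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face carries enum likelihoods rather than numeric confidences.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```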

Feature analysis

Amazon

Person 99.3%
Chair 73.4%

Captions

Microsoft

a person standing in front of a building 76.5%
a person that is standing in front of a building 72.4%
a person standing next to a building 71.2%
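
Ranked alternative captions like these are what Azure Computer Vision's describe operation returns when more than one candidate is requested. A sketch (endpoint, key, and URL are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
)

# Ask for several ranked caption candidates instead of the single default.
description = client.describe_image(
    "https://example.org/4.2002.21745.jpg",  # illustrative URL
    max_candidates=3,
)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```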

Text analysis

Amazon

ST
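
A short fragment like "ST" is typical of Rekognition's DetectText on a photograph with little legible writing. A sketch (the file name is illustrative):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.21745.jpg", "rb") as f:  # illustrative file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back at both LINE and WORD granularity; print the lines.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```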

Google

YT37A°2- XAO
YT37A°2-
XAO
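
Google's OCR explains the whole-then-split repetition above: the first text annotation is the full concatenated string, and the remaining annotations are its individual pieces. A sketch with text_detection (the file name is illustrative):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.21745.jpg", "rb") as f:  # illustrative file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full detected text; later entries are the pieces,
# matching the whole-then-split listing above.
for annotation in response.text_annotations:
    print(annotation.description)
```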