Human Generated Data

Title

Untitled (two people with wedding cake)

Date

1970s copy negative from a c. 1945 negative

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21775

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 97.6
Person 97.2
Person 87.1
Hat 85.7
Clothing 85.7
Apparel 85.7
Sailor Suit 59.5
Screen 58.5
Electronics 58.5
Monitor 56.7
Display 56.7
Military 55.9
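
The label/confidence pairs above have the shape of output from Amazon Rekognition's DetectLabels operation (the face and feature sections of this record are explicitly attributed to AWS Rekognition). A minimal sketch of how such tags could be generated, assuming configured AWS credentials and a local scan of the negative; the filename is hypothetical:

```python
import boto3

# Hypothetical local scan of the negative; any JPEG or PNG bytes work.
with open("4.2002.21775.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns labels scored 0-100, matching the values above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50.0,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```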

Clarifai
created on 2023-10-23

people 99.9
group 98.8
monochrome 98.5
group together 98.4
two 97.7
adult 96.4
several 95.9
vehicle 95.7
man 95.4
many 95
watercraft 94.9
transportation system 94.5
war 93.9
three 93.7
furniture 93.6
one 93.2
military 93.1
wear 92.5
no person 89.4
warship 87.7
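
Concept predictions of this shape (names with 0-1 confidence values, shown above as percentages) can be obtained from Clarifai's v2 predict endpoint. A hedged sketch against the REST API; the API key placeholder, model ID, and image URL are assumptions, not values from this record:

```python
import requests

# Placeholders: a Clarifai API key and the public general recognition model.
API_KEY = "YOUR_CLARIFAI_KEY"
MODEL_ID = "general-image-recognition"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/scan.jpg"}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 value; scale by 100 to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```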

Imagga
created on 2022-03-11

architecture 25.9
building 23.2
travel 19.7
man 16.2
old 16
city 15
vehicle 14.9
male 14.2
construction 12.8
sky 12.7
history 12.5
boat 12.5
people 11.7
ship 11.7
military vehicle 11.5
tourism 11.5
water 10.7
steel 10.6
shop 10.5
office 10.4
famous 10.2
sea 10.2
transportation 9.9
tower 9.8
metal 9.6
uniform 9.6
engineer 9.5
passenger 9.5
house 9.4
car 9.4
church 9.2
landmark 9
culture 8.5
temple 8.5
industry 8.5
person 8.5
business 8.5
adult 8.5
monument 8.4
vessel 8.3
mercantile establishment 8.3
vintage 8.3
outdoors 8.2
vacation 8.2
half track 8.1
bakery 7.9
urban 7.9
tourist 7.8
men 7.7
harbor 7.7
sculpture 7.6
capital 7.6
military uniform 7.6
wheeled vehicle 7.6
tracked vehicle 7.5
ocean 7.5
landscape 7.4
conveyance 7.3
religion 7.2
work 7.1
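
Imagga's tagging output pairs each tag with a 0-100 confidence, as listed above. A sketch against Imagga's v2 /tags endpoint, assuming API credentials (HTTP Basic auth) and a hosted copy of the image; both are placeholders:

```python
import requests

# Placeholder Imagga credentials and a hosted image URL.
API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/scan.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each tag carries a 0-100 confidence, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```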

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 99.6
ship 92.4
indoor 86.8
black 84.2
black and white 81.6
vehicle 72.2
old 68.1
tank 50.4
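
Tags of this form (name plus confidence) are what Azure's Computer Vision Analyze Image operation returns. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

```python
import requests

# Placeholders: an Azure Computer Vision resource endpoint and key.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/scan.jpg"},
)
response.raise_for_status()

# Confidence is 0-1 here; scale by 100 to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```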

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 97.4%
Happy 90.2%
Sad 5.9%
Disgusted 1.1%
Confused 0.9%
Surprised 0.6%
Calm 0.6%
Angry 0.5%
Fear 0.3%
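
These values match the shape of Rekognition's DetectFaces response when all attributes are requested: an estimated age range, a gender call with confidence, and a confidence per emotion. A minimal sketch, reusing the hypothetical local scan from the tagging example:

```python
import boto3

with open("4.2002.21775.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```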

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely
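
Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the wording above. A sketch with the google-cloud-vision client, assuming application default credentials and the same hypothetical scan file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.21775.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets (VERY_UNLIKELY ... VERY_LIKELY),
# matching the "Very unlikely" / "Possible" wording above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```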

Feature analysis

Amazon

Person 97.2%
Person 87.1%
Hat 85.7%

Captions

Text analysis

Amazon

GXO
DIG
Light
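
The three strings above are OCR fragments. Rekognition's DetectText operation returns detections of this kind as LINE and WORD items; a minimal sketch, again assuming a local scan file:

```python
import boto3

with open("4.2002.21775.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectText returns LINE and WORD detections with the detected string.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```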