Human Generated Data

Title

Untitled (two men working on airplane)

Date

c. 1950

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21787

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.4
Human 99.4
Person 99.4
Clothing 96
Apparel 96
Nature 74.8
Vehicle 74.4
Transportation 74.4
Airplane 73.7
Aircraft 73.7
Outdoors 67.2
Building 64.7
Suit 64.5
Coat 64.5
Overcoat 64.5
Face 63.6
Shorts 60
Wheel 58.4
Machine 58.4
Airfield 58
Airport 58
Helmet 56.7
Hangar 56.5
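Tag lists like the Amazon block above pair each label with a confidence score, and a common post-processing step is to filter out low-confidence labels before display. A minimal sketch of that step (the threshold, data structure, and the subset of tags shown are illustrative, not part of the record):

```python
# A few of the Amazon-generated tags above, as (label, confidence) pairs.
tags = [
    ("Person", 99.4), ("Clothing", 96.0), ("Nature", 74.8),
    ("Airplane", 73.7), ("Outdoors", 67.2), ("Hangar", 56.5),
]

def filter_tags(tags, min_confidence=70.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= min_confidence]

print(filter_tags(tags))  # high-confidence labels only
```

With the illustrative 70.0 threshold, labels such as Outdoors (67.2) and Hangar (56.5) would be dropped.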

Clarifai
created on 2023-10-22

people 99.4
group together 98.5
monochrome 97.5
man 97.4
adult 96.2
two 95
three 90.7
group 89.2
aircraft 86.9
airplane 86.5
four 83.3
furniture 83.2
wear 82.7
several 82
vehicle 80.3
woman 78.8
street 77.6
five 76.7
portrait 72.2
transportation system 70

Imagga
created on 2022-03-11

astronaut 29.9
sky 26.8
travel 19
water 17.3
winter 17
snow 17
negative 16.2
landscape 15.6
cold 15.5
sand 13.8
cloud 13.8
sea 13.3
outdoor 13
film 12.8
man 12.8
building 12.7
tourism 12.4
beach 12.1
clouds 11.8
architecture 11.7
season 11.7
transportation 11.7
river 11.6
black 11.5
holiday 11.5
old 11.1
day 11
coast 10.8
ocean 10.1
person 9.9
photographic paper 9.9
environment 9.9
weather 9.8
outdoors 9.7
structure 9.6
vintage 9.1
protection 9.1
vacation 9
light 8.7
male 8.5
cloudy 8.4
ice 8.4
city 8.3
work 8.2
horizon 8.1
sun 8
urban 7.9
sunny 7.7
military 7.7
fog 7.7
stone 7.6
bay 7.5
house 7.5
wood 7.5
drawing 7.3
daily 7.3
industrial 7.3
sketch 7.2
road 7.2
world 7.1

Microsoft
created on 2022-03-11

text 99.1
person 94.9
man 90.9
black and white 78.2
clothing 75.7

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 97.8%
Happy 86.8%
Sad 5.9%
Surprised 3.1%
Calm 1.5%
Angry 0.8%
Confused 0.7%
Disgusted 0.7%
Fear 0.6%

AWS Rekognition

Age 52-60
Gender Male, 52%
Happy 77.8%
Confused 10.5%
Surprised 4.5%
Sad 3.4%
Calm 1.7%
Disgusted 0.9%
Angry 0.8%
Fear 0.4%
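Rekognition-style emotion scores, like those in the two face records above, are typically summarized by reporting the highest-scoring emotion. A minimal sketch of that summary step (the scores are copied from the first face record; the function name is illustrative):

```python
# Emotion scores for the first detected face, taken from the record above.
emotions = {
    "Happy": 86.8, "Sad": 5.9, "Surprised": 3.1, "Calm": 1.5,
    "Angry": 0.8, "Confused": 0.7, "Disgusted": 0.7, "Fear": 0.6,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # -> Happy
```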

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Person 99.4%
Airplane 73.7%

Captions

Microsoft
created on 2022-03-11

an old photo of a man 83.9%
old photo of a man 82.1%
a group of people posing for a photo 67.7%

Text analysis

Amazon

KODAK-A-ITW

Google

MJI7-- YT37A°2-- AGO
MJI7--
YT37A°2--
AGO