Human Generated Data

Title

Untitled (people next to helicopter)

Date

c. 1950

People

Artist: C. Bennette Moore, American, 1879-1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21779

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Clarifai
created on 2023-10-22

people 99.8
group together 99.4
aircraft 98
group 96.7
vehicle 96.2
adult 93.7
military 93.7
many 92.6
transportation system 92.4
airplane 90.6
war 90
navy 90
man 89.1
aviate 88.3
three 87.2
woman 87
monochrome 85.2
leader 83.2
watercraft 80.8
two 79.7

Imagga
created on 2022-03-11

stage 46.3
platform 34.3
billboard 33.4
signboard 26.3
structure 19.3
man 18.1
people 16.7
black 16.2
protection 15.5
person 15.4
night 15.1
mask 13.4
industrial 12.7
city 12.5
light 11.4
male 11.3
industry 11.1
dark 10.9
music 10.8
television 10.6
power 10.1
military 9.7
metal 9.7
factory 9.6
equipment 9.2
playing 9.1
danger 9.1
color 8.9
park 8.8
symbol 8.7
lifestyle 8.7
gas 8.7
work 8.6
party 8.6
war 8.6
entertainment 8.3
fun 8.2
technology 8.2
worker 8
destruction 7.8
adult 7.8
fight 7.7
silhouette 7.4
lights 7.4
weapon 7.3
dirty 7.2
transportation 7.2
player 7
modern 7

Microsoft
created on 2022-03-11

text 99.8
outdoor 94.3
black and white 79.5
person 76.5
transport 65.1
music 56.3
posing 35
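
The tag lists above pair a label with a confidence score (in percent). As an illustration only, not the pipeline that produced this record: a minimal Python sketch of how tags of this kind could be generated with AWS Rekognition's detect_labels call (the service behind the Amazon entries on this page), assuming boto3 credentials are configured and a hypothetical local copy of the photograph.

import boto3

# Hypothetical local file; the path is an assumption, not part of the record.
IMAGE_PATH = "untitled_people_next_to_helicopter.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# detect_labels returns label names with confidence percentages,
# matching the "tag score" layout shown above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,
)

for label in response["Labels"]:
    print(f"{label['Name'].lower()} {label['Confidence']:.1f}")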

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.8%
Confused 51.7%
Calm 24.4%
Sad 7.2%
Disgusted 4.6%
Surprised 4.1%
Happy 3.9%
Fear 2.1%
Angry 2%

AWS Rekognition

Age 41-49
Gender Male, 81.4%
Calm 92.2%
Sad 3%
Happy 1.8%
Confused 1.4%
Surprised 0.9%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 55.6%
Calm 99.2%
Sad 0.3%
Happy 0.2%
Surprised 0.2%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 92.6%
Calm 93.3%
Happy 5.6%
Surprised 0.5%
Sad 0.4%
Disgusted 0.1%
Fear 0.1%
Confused 0.1%
Angry 0%
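
The age range, gender, and per-emotion percentages above follow the shape of Rekognition's detect_faces response when Attributes=['ALL'] is requested. A minimal sketch under the same assumptions as the labeling example (the file path is hypothetical):

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_people_next_to_helicopter.jpg", "rb") as f:  # assumed path
    image_bytes = f.read()

# Attributes=['ALL'] adds AgeRange, Gender, and Emotions to each face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to mirror the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")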

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
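
The Google Vision entries report bucketed likelihoods (Very unlikely through Very likely) rather than percentages. A minimal sketch of reading those fields with the google-cloud-vision client; the file path is again hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_people_next_to_helicopter.jpg", "rb") as f:  # assumed path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enums (VERY_UNLIKELY .. VERY_LIKELY),
    # corresponding to the buckets shown above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)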

Feature analysis

Amazon

Person 99.4%
Person 99.3%
Person 99.3%
Person 94.7%
Person 85.9%
Helicopter 75.9%
Airplane 60%

Text analysis

Amazon

228B
OLEUM
OLEUM BELL-HELICOPTE
BELL-HELICOPTE
SAI
KODAEOL

Google

228B CLEUM BELL-HELICOPTER
228B
CLEUM
BELL-HELICOPTER
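
OCR output like the Amazon list above, including partial reads such as BELL-HELICOPTE, is the shape returned by Rekognition's detect_text, which reports both LINE and WORD detections. A minimal sketch under the same assumptions as the earlier snippets:

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_people_next_to_helicopter.jpg", "rb") as f:  # assumed path
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is a LINE or a WORD; lists like the one above mix both.
for det in response["TextDetections"]:
    print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}%")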