Human Generated Data

Title

Untitled (medevac helicopter, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.201

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Person 96.7
Person 96.6
Person 93.8
Person 90.6
Adult 90.6
Male 90.6
Man 90.6
Person 90.4
Baby 90.4
Person 89.9
Baby 89.9
Person 88.7
Person 88.5
Person 88.3
Person 88.3
Baby 88.3
Person 87.3
Baby 87.3
Face 86.1
Head 86.1
Person 86.1
Person 85.6
Person 82.9
Person 82.1
Person 80.5
Adult 80.5
Male 80.5
Man 80.5
Person 78.8
Person 78.3
Computer Hardware 78.1
Electronics 78.1
Hardware 78.1
Monitor 78.1
Screen 78.1
Person 75.7
Baby 75.7
Photographic Film 66.4
Art 64.6
Collage 64.6
Person 63.8
Person 63.8
Person 62.5
Person 61.9
Person 58

Clarifai
created on 2019-02-18

television 99.2
people 99
screen 97.7
group 96.8
no person 96.7
many 96.6
man 95.3
display 95.2
exhibition 94.5
movie 93.4
technology 93.2
adult 92.7
picture frame 90.9
landscape 90.7
indoors 90
injury 89.8
collection 89.7
one 89.6
woman 89.6
vehicle 89

Imagga
created on 2019-02-18

case 100
furniture 21.1
food 17.5
buffet 16.1
restaurant 14.8
furnishing 14.7
design 11.8
interior 11.5
fruit 10.6
glass 10.2
home 9.6
architecture 9.4
cabinet 9.3
dinner 9.2
traditional 9.1
business 9.1
building 8.8
luxury 8.6
culture 8.5
money 8.5
plate 8.5
modern 8.4
color 8.3
technology 8.1
kitchen 8
close 8
china cabinet 7.9
flowers 7.8
black 7.8
table 7.8
store 7.5
equipment 7.5
place 7.4
oil 7.4
shop 7.4
shelf 7.3
meal 7.3
celebration 7.2
colorful 7.2

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

indoor 91.8
different 43.8
art 43.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 99.8%
Sad 80.3%
Surprised 32.3%
Calm 24.7%
Confused 11.1%
Fear 6.4%
Disgusted 1.8%
Angry 1.2%
Happy 0.4%

AWS Rekognition

Age 26-36
Gender Male, 99.6%
Calm 77.8%
Happy 7.6%
Surprised 6.9%
Fear 6.8%
Sad 4.5%
Confused 2.6%
Disgusted 1.9%
Angry 1.1%

AWS Rekognition

Age 22-30
Gender Male, 70.6%
Calm 71.6%
Sad 46.4%
Surprised 6.3%
Fear 6.2%
Angry 0.7%
Disgusted 0.5%
Confused 0.3%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 98.9%
Calm 97.4%
Surprised 6.4%
Fear 6%
Sad 2.3%
Angry 0.5%
Disgusted 0.4%
Confused 0.4%
Happy 0.3%

AWS Rekognition

Age 12-20
Gender Male, 98.2%
Calm 80.7%
Sad 15.6%
Surprised 6.6%
Fear 6.1%
Happy 1.3%
Disgusted 0.8%
Angry 0.6%
Confused 0.4%

AWS Rekognition

Age 20-28
Gender Male, 96.9%
Calm 78.8%
Surprised 7.9%
Fear 6.6%
Sad 5.2%
Angry 4.1%
Disgusted 2.3%
Happy 2%
Confused 1.7%

Feature analysis

Amazon

Person 96.7%
Adult 90.6%
Male 90.6%
Man 90.6%
Baby 90.4%
Monitor 78.1%

Categories

Imagga

text visuals 58.3%
paintings art 22.3%
interior objects 17.1%

Captions

Text analysis

Amazon

3
NYA
THE
Y
THA
-
NYA THE MPOOM
WTH NYA - THA WOOD
NYA Y IMI МУЛОЧ
WOOD
IMI
IMA
MPOOM
МУЛОЧ
WTH
HAGOY
CATH
ROU

Google

Cu 10 13
Cu
10
13