Human Generated Data

Title

Untitled (soldiers working on helicopter, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.232

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Art 99.6
Collage 99.6
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 98
Male 98
Man 98
Person 98
Person 97.1
Adult 94.7
Male 94.7
Man 94.7
Person 94.7
Person 93.8
Person 91.2
Baby 91.2
Face 91
Head 91
Person 90.6
Adult 88.2
Person 88.2
Bride 88.2
Female 88.2
Wedding 88.2
Woman 88.2
Adult 84.9
Male 84.9
Man 84.9
Person 84.9
Adult 84.9
Male 84.9
Man 84.9
Person 84.9
Person 84.7
Baby 84.7
Aircraft 83.4
Airplane 83.4
Transportation 83.4
Vehicle 83.4
Person 73.1
Indoors 63.7
Furniture 56.3
Photo Booth 55.6
Photographic Film 55.3
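
Label lists of this kind are what Amazon Rekognition's DetectLabels operation returns: each label carries a confidence score, and repeated labels (Person, Adult, Male) correspond to separate detected instances in the frame. Below is a minimal sketch of how such tags could be generated, assuming boto3 credentials are already configured and using a hypothetical local file name (photo.jpg) in place of the actual catalogued image; the MaxLabels and MinConfidence values are assumptions chosen to roughly match the list above.

    import boto3

    IMAGE_PATH = "photo.jpg"  # hypothetical file name for the photograph

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,        # assumed cap
            MinConfidence=55,    # assumed threshold; lowest score listed is 55.3
        )

    # Print each label with one decimal of confidence, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")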

Clarifai
created on 2019-02-18

television 99.4
vehicle 98.2
technology 98.2
equipment 98
industry 97.4
storage 96.6
movie 96.5
people 96.4
transportation system 95.3
many 93.2
video recording 93
old 92.8
group 92.7
retro 92.4
rack 92
car 91.4
nostalgia 91.1
man 90.9
no person 90.8
display 90.1

Imagga
created on 2019-02-18

case 25.3
equipment 23.8
furniture 19.3
technology 18.5
electronic equipment 16.6
furnishing 14.1
design 14
buffet 13.5
digital 12.9
device 12.6
window 12.3
cassette 11.9
film 10.4
black 10.2
monitor 10.1
retro 9.8
old 9.7
web site 9.7
control panel 9.5
industry 9.4
texture 9
interior 8.8
cabinet 8.6
television 8.5
amplifier 8.4
sound 8.4
modern 8.4
power 8.4
computer 8.1
cassette tape 8
close 8
art 7.9
display 7.9
colorful 7.9
container 7.9
architecture 7.8
web 7.6
communication 7.5
screen 7.5
pattern 7.5
shelf 7.4
classic 7.4
style 7.4
network 7.4
object 7.3
frame 7.2
radio receiver 7.1
glass 7.1

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

different 48.1
art 48.1
window 36.1
black and white 19.4

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 82.5%
Angry 47.3%
Happy 21.1%
Calm 8.9%
Fear 8.6%
Disgusted 6.7%
Surprised 6.7%
Sad 6.2%
Confused 1.3%

AWS Rekognition

Age 25-35
Gender Male, 93.6%
Happy 74.8%
Fear 7.5%
Surprised 7.2%
Angry 5.4%
Sad 4.7%
Calm 3.4%
Disgusted 3.3%
Confused 1.9%

AWS Rekognition

Age 29-39
Gender Male, 98.3%
Calm 77.6%
Surprised 8%
Sad 6.8%
Fear 6.5%
Angry 5%
Confused 1.8%
Disgusted 1.7%
Happy 0.8%

AWS Rekognition

Age 28-38
Gender Male, 100%
Sad 94.7%
Calm 47.6%
Surprised 6.9%
Fear 6%
Confused 1.7%
Angry 1%
Happy 0.7%
Disgusted 0.6%

AWS Rekognition

Age 33-41
Gender Male, 77.9%
Angry 70%
Happy 16%
Surprised 6.5%
Fear 6.3%
Sad 4.7%
Disgusted 4.2%
Confused 1.8%
Calm 0.8%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 82.2%
Surprised 6.7%
Fear 6%
Confused 5.3%
Disgusted 4.7%
Sad 4.1%
Angry 1.6%
Happy 0.5%
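
The six entries above (an age range, a gender estimate, and a ranked emotion breakdown per detected face) have the shape of Amazon Rekognition's DetectFaces response. A sketch of the corresponding call, again assuming configured boto3 credentials and the hypothetical photo.jpg:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are reported for every face; sorting by confidence
        # reproduces the ranked lists shown above.
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        for emotion in emotions:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")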

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
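
Unlike Rekognition, Google Vision reports likelihood categories (Very unlikely through Very likely) rather than percentages. A sketch of the corresponding face-detection call with the google-cloud-vision client, assuming configured credentials and the same placeholder photo.jpg:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each detected face carries likelihood enums (e.g. VERY_UNLIKELY,
    # POSSIBLE), matching the categorical entries listed above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)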

Feature analysis

Amazon

Adult 98.3%
Male 98.3%
Man 98.3%
Person 98.3%
Baby 91.2%
Bride 88.2%
Female 88.2%
Woman 88.2%
Airplane 83.4%

Categories

Imagga

text visuals 99.9%

Captions

Text analysis

Amazon

AG
TE
SON
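
Short fragments like "AG", "TE", and "SON" are typical of Amazon Rekognition's DetectText output on stencilled or partly obscured lettering. A sketch with the same assumptions as the earlier snippets (configured boto3 credentials, hypothetical photo.jpg):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # WORD detections are fragments of the LINE detections; partially
    # legible markings tend to surface as short fragments like those above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")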