Human Generated Data

Title

Untitled (soldiers walking through fields, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.90

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Art 99.9
Collage 99.9
Adult 99
Male 99
Man 99
Person 99
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Face 97.6
Head 97.6
Person 91.1
Person 87
Person 83.6
Person 73.2
Person 71.8
Person 69.5
Photographic Film 59.2
Clothing 56.8
Hat 56.8

Clarifai
created on 2023-10-15

negative 100
movie 99.8
filmstrip 99.8
exposed 99.8
cinematography 99
slide 98.5
photograph 97.9
bobbin 96.2
emulsion 95.8
monochrome 95.8
people 94.9
video 94.6
margin 93.8
old 93.3
desktop 93.2
collage 92.7
projection 92.5
dirty 92.5
adult 92
noisy 91.4

Imagga
created on 2019-02-03

negative 100
film 93.9
photographic paper 62.2
photographic equipment 41.6
screen 30.7
computer 30.5
background 27
laptop 24.1
business 23.1
display 20.8
technology 17.8
digital 17
office 16.9
vintage 16.5
work 16.5
money 16.1
monitor 15.6
finance 15.2
keyboard 15.1
equipment 14.9
retro 14.7
close 14.3
financial 14.2
currency 13.4
paper 13.3
old 13.2
closeup 12.1
camera 12
modern 11.9
cash 11.9
frame 11.6
notebook 11.6
design 11.2
electronic device 11.1
art 10.8
black 10.8
movie 10.6
mail 10.5
device 10.4
grunge 10.2
graphic 10.2
dollar 10.2
professional 10.1
letter 10.1
person 10
bank 10
postmark 9.9
postage 9.8
postal 9.8
web site 9.7
job 9.7
stamp 9.7
object 9.5
man 9.4
electronic 9.3
banking 9.2
data 9.1
studio 9.1
border 9
wealth 9
people 8.9
noise 8.8
slide 8.8
bills 8.7
strip 8.7
dollars 8.7
bill 8.6
space 8.5
desk 8.5
savings 8.4
hand 8.3
network 8.3
entertainment 8.3
rough 8.2
dirty 8.1
silver 8
circa 7.9
banknotes 7.8
typing 7.8
education 7.8
blank 7.7
damaged 7.6
web 7.6
one 7.5
student 7.2
aged 7.2
information 7.1
working 7.1

Google
created on 2019-02-03

Microsoft
created on 2019-02-03

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 90.3%
Calm 33.7%
Surprised 30%
Fear 15.2%
Disgusted 10.9%
Angry 8.1%
Happy 4.8%
Sad 3.4%
Confused 2%

AWS Rekognition

Age 38-46
Gender Male, 99.3%
Calm 98.1%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Disgusted 0.4%
Confused 0.4%
Happy 0.3%
Angry 0.2%

AWS Rekognition

Age 16-24
Gender Female, 88.1%
Calm 44.1%
Fear 13.9%
Disgusted 10.8%
Surprised 9.8%
Happy 9.5%
Sad 6.1%
Angry 6%
Confused 2.1%

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Hat 56.8%

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

ES
AS
AES
ATS
AAS
TS
.....
x
A2S
AaS
as
LAT
ETEW
MACON
MIIT ИЛО x LAT MACON
XAGOX
ИЛО
MIIT
clin
133A2