Human Generated Data

Title

Untitled (three men in tent holding shotguns)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14383

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99
Human 99
Person 99
Person 98.2
Camping 88.7
Leisure Activities 77.1
Tent 75.9
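The Amazon scores above are confidence values on a 0-100 scale, the kind of output AWS Rekognition label detection returns. Below is a minimal sketch, assuming boto3 is installed, AWS credentials are configured, and a hypothetical local copy of the image named 4.2002.14383.jpg.

import boto3

# Minimal sketch: label detection with AWS Rekognition via boto3.
# The file name is a placeholder for a local copy of the photograph.
client = boto3.client("rekognition")

with open("4.2002.14383.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=70,  # keep only reasonably confident labels
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')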

Imagga
created on 2022-01-29

building 34.6
structure 32.3
old 31.3
wall 30.4
greenhouse 28.9
grunge 27.2
vintage 24
antique 20.8
landscape 17.8
texture 17.4
frame 16.6
retro 16.4
house 15.9
grungy 15.2
rural 15
forest 14.8
car 14.4
freight car 14.2
paper 14.1
art 13.7
trees 13.3
country 13.2
snow 13.1
architecture 12.8
graffito 12.7
negative 12.5
space 12.4
tree 12.3
winter 11.9
film 11.8
aged 11.8
decoration 11.7
dirty 11.7
season 11.7
wood 11.7
sky 11.5
scenic 11.4
weathered 11.4
garage 11.1
grain 11.1
rough 10.9
black 10.8
window 10.7
wheeled vehicle 10.7
rustic 10.6
textured 10.5
ancient 10.4
countryside 10
border 9.9
travel 9.9
sepia 9.7
autumn 9.7
brown 9.6
artistic 9.6
empty 9.4
field 9.2
park 9.2
road 9
roof 9
pattern 8.9
mist 8.7
torn 8.7
light 8.7
fog 8.7
vehicle 8.6
blank 8.6
old fashioned 8.6
screen 8.5
tourism 8.2
scenery 8.1
history 8.1
home 8
barn 7.9
canvas tent 7.9
design 7.9
text 7.9
album 7.8
cold 7.8
collage 7.7
construction 7.7
stained 7.7
fence 7.7
mountains 7.4
historic 7.3
paint 7.2
fall 7.2
morning 7.2
fantasy 7.2
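The Imagga tags above are typical of a REST tagging call. A hedged sketch follows, assuming Imagga's v2 /tags endpoint and response shape; the API key, secret, and image URL are placeholders.

import requests

# Hedged sketch of an Imagga-style tagging request; the credentials and
# image URL are placeholders, and the response shape is an assumption.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/4.2002.14383.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')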

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

outdoor 97.1
text 80.9
person 77
drawing 68.2
black and white 64.8
clothing 62.2
old 62
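The Microsoft tags above resemble output from Azure Computer Vision image tagging. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch of Azure Computer Vision tagging; endpoint, key, and the
# local file name are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("4.2002.14383.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")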

Face analysis

Amazon

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Calm 90.9%
Happy 7.3%
Sad 0.8%
Surprised 0.4%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Female, 77.8%
Calm 53.6%
Sad 20.5%
Happy 19.6%
Confused 3.2%
Surprised 1.3%
Angry 0.7%
Fear 0.6%
Disgusted 0.5%
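The two face records above (age range, gender, and ranked emotions) match what AWS Rekognition face detection returns per detected face. A minimal sketch, assuming boto3 and configured AWS credentials; the file name is a placeholder.

import boto3

# Minimal sketch: per-face attributes (age range, gender, emotions) from
# AWS Rekognition; the file name is a placeholder.
client = boto3.client("rekognition")

with open("4.2002.14383.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')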

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
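The three Google Vision records above report likelihood buckets rather than numeric scores. A minimal sketch, assuming the google-cloud-vision client library and configured credentials; the file name is a placeholder.

from google.cloud import vision

# Minimal sketch: per-face likelihood buckets from Google Cloud Vision;
# the file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("4.2002.14383.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)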

Feature analysis

Amazon

Person 99%
Tent 75.9%

Captions

Microsoft

an old photo of a person 60.3%
old photo of a person 54.6%
a person sitting in a field 41.8%
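The ranked captions above, each with a confidence score, resemble Azure Computer Vision image description output. A minimal sketch, assuming the same SDK as in the tagging example; endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: ranked caption candidates from Azure Computer Vision;
# endpoint, key, and the local file name are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("4.2002.14383.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")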

Text analysis

Amazon

8
MJIR
ARAA
MJIR YT33AS ARAA
YT33AS
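The Amazon strings above are the kind of output AWS Rekognition text detection returns, which reports both whole lines and individual words. A minimal sketch, assuming boto3 and configured AWS credentials; the file name is a placeholder.

import boto3

# Minimal sketch: text (OCR) detection with AWS Rekognition; the file name
# is a placeholder. Detections include both LINE and WORD entries.
client = boto3.client("rekognition")

with open("4.2002.14383.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])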

Google

YT3RA
2
MJI7
MJI7 YT3RA 2 A73A
A73A
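The Google strings above are consistent with Google Cloud Vision text detection, which returns the full detected text plus individual tokens. A minimal sketch, assuming the google-cloud-vision client library; the file name is a placeholder.

from google.cloud import vision

# Minimal sketch: OCR with Google Cloud Vision text detection; the file name
# is a placeholder. The first annotation is the full text, the rest are tokens.
client = vision.ImageAnnotatorClient()

with open("4.2002.14383.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)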