Human Generated Data

Title

Untitled (two girls sitting with two dogs on ledge in yard)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12503
Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 96.1
Human 96.1
Person 95.1
Person 87.9
Nature 83.2
Outdoors 80.1
Person 73.9
Helmet 72.7
Clothing 72.7
Apparel 72.7
Canine 70.6
Animal 70.6
Mammal 70.6
Tree 68.9
Plant 68.9
Pet 66.8
Tent 66.6
Vegetation 66.4
Jacuzzi 66.3
Tub 66.3
Hot Tub 66.3
Girl 58.6
Female 58.6

Clarifai
created on 2023-10-27

people 99.8
child 99.3
boy 97.6
vehicle 95.5
two 94.9
adult 94.9
one 93.7
man 93.6
monochrome 93
campsite 92.8
war 92.1
tree 91.2
wear 90.4
group 90
transportation system 89.3
woman 89
home 87.2
son 86.7
analogue 86.3
recreation 84.4

Imagga
created on 2022-01-29

cannon 51.9
vehicle 33.3
gun 33.2
tank 30.2
weaponry 28.3
weapon 25.4
high-angle gun 24.5
sky 24.4
artillery 24.3
tracked vehicle 23.7
military vehicle 23.7
gunnery 23.5
landscape 17.8
water 15.3
armament 15.3
armored vehicle 15.1
wheeled vehicle 14.4
track 13.9
cloud 13.8
travel 13.4
river 13.3
outdoor 13
tie 12.6
power 12.6
sunset 12.6
old 12.5
conveyance 12.3
scenery 11.7
military 11.6
smoke 11.2
danger 10.9
half track 10.7
environment 10.7
scene 10.4
stone 10.3
sand 10.2
clouds 10.1
brace 9.7
summer 9.6
rifle 9.6
day 9.4
tree 9.2
ocean 9.1
black 9
vacation 9
transportation 9
outdoors 9
sun 8.8
machine 8.8
scenic 8.8
army 8.8
war 8.7
sea 8.6
park 8.2
peace 8.2
industrial 8.2
road 8.1
building 8
architecture 7.8
steam 7.8
industry 7.7
cloudy 7.5
city 7.5
tourism 7.4
land 7.4
device 7.3
strengthener 7.3
coast 7.2
field artillery 7

Microsoft
created on 2022-01-29

weapon 98.9
black and white 97.3
sitting 94.9
text 88.1
monochrome 87.7
rifle 72.8
tank 71.9

Face analysis
AWS Rekognition

Age 19-27
Gender Female, 97.7%
Calm 69.4%
Sad 14.7%
Surprised 6.6%
Happy 3.3%
Angry 2.9%
Disgusted 1.8%
Confused 0.8%
Fear 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon
Person 96.1%
Person 95.1%
Person 87.9%
Person 73.9%
Helmet 72.7%
Tent 66.6%
Jacuzzi 66.3%

Text analysis

Amazon

17
HX
w
I د w 17 HX
NYEN
IE
NYEN DH
د
DH
I