Human Generated Data

Title

Untitled (photograph of Gittings photo: "The Ponycart")

Date

c. 1970

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13366

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Vehicle 96.4
Transportation 96.4
Bike 96.4
Human 90.8
Person 90.8
Bicycle 84.6
Machine 77.8
Wheel 77.8
Sport 74.8
Sports 74.8
Cyclist 74.8
Spoke 66.1
Person 65.8
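
The Amazon tags above match the shape of output from AWS Rekognition's DetectLabels operation. A minimal sketch of how labels like these could be generated with boto3, assuming a local copy of the digitized print (file name and confidence threshold are placeholders):

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local path to the digitized print.
with open("ponycart.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # assumed threshold; the page does not state one
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```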

Imagga
created on 2022-01-23

balcony 84.1
barrow 40
handcart 31.2
wheeled vehicle 30.4
vehicle 20.6
man 20.1
chair 18.1
window 17.2
structure 17
building 16.1
people 14.5
male 14.2
architecture 13.3
interior 13.3
silhouette 13.2
wall 12.8
day 12.5
sitting 12
outdoors 11.2
conveyance 11.2
business 10.9
house 10.9
city 10.8
outdoor 10.7
couple 10.4
old 10.4
indoor 10
grand piano 10
urban 9.6
home 9.6
seat 9.4
lifestyle 9.4
glass 9.3
tricycle 9.2
modern 9.1
bench 9
sky 8.9
landscape 8.9
table 8.8
office 8.8
businessman 8.8
outside 8.5
person 8.5
black 8.4
wood 8.3
stringed instrument 8.3
style 8.2
cheerful 8.1
sunset 8.1
transportation 8.1
sun 8
piano 7.9
adult 7.9
women 7.9
love 7.9
musical instrument 7.8
room 7.7
relax 7.6
plant 7.5
keyboard instrument 7.3
water 7.3
color 7.2
trees 7.1
summer 7.1
rural 7
travel 7
indoors 7
scenic 7
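
Imagga exposes auto-tagging through a REST API. A minimal sketch of how a tag list like this could be requested, assuming the v2 /tags endpoint with placeholder credentials and file path (the exact request and response fields should be verified against Imagga's documentation):

```python
import requests

# Placeholder credentials; basic auth with the account's API key and secret.
auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    resp = requests.post("https://api.imagga.com/v2/tags", auth=auth, files={"image": f})
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```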

Google
created on 2022-01-23

Wheel 96.6
Tire 95.6
Vehicle 93.8
Motor vehicle 91
Plant 89.7
Tree 87.2
Cart 83
Carriage 79.5
Adaptation 79.3
Rickshaw 77.9
Tints and shades 77.4
Rectangle 75.4
Leisure 74.9
Classic 74.8
Poster 72.5
Art 71.4
Vintage clothing 71.3
Wood 68.3
Sitting 67
Room 66.7
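
Labels of this kind can be requested from the Google Cloud Vision API. A minimal sketch using the google-cloud-vision client library; the file path is a placeholder, and Vision's 0-1 scores are rescaled here to match the percentages in the listing:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Vision reports scores in 0-1; rescale to match the percentages above.
    print(f"{label.description} {label.score * 100:.1f}")
```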

Microsoft
created on 2022-01-23

window 95.2
text 85.8
cart 71.9
chair 68.7
bench 67
furniture 64
room 42.5
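
Microsoft's tags are consistent with the Azure Computer Vision image-analysis API. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, with the endpoint, key, and file path as placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("AZURE_CV_KEY"),  # placeholder key
)

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    analysis = client.analyze_image_in_stream(f, visual_features=[VisualFeatureTypes.tags])

for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```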

Face analysis

AWS Rekognition

Age 0-6
Gender Female, 88.5%
Calm 97.6%
Sad 0.6%
Happy 0.5%
Confused 0.3%
Disgusted 0.3%
Surprised 0.3%
Fear 0.2%
Angry 0.2%
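
The age range, gender, and emotion scores above match the shape of AWS Rekognition's DetectFaces response. A minimal sketch of how such a face analysis could be run with boto3 (file path is a placeholder):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" is required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```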

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
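
The Google Vision face results report per-face likelihood categories rather than numeric scores, one block per detected face. A minimal sketch of how they could be obtained with the google-cloud-vision client (file path is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY, reformatted here
    # to match the wording of the listing above.
    fields = {
        "Surprise": face.surprise_likelihood,
        "Anger": face.anger_likelihood,
        "Sorrow": face.sorrow_likelihood,
        "Joy": face.joy_likelihood,
        "Headwear": face.headwear_likelihood,
        "Blurred": face.blurred_likelihood,
    }
    for name, likelihood in fields.items():
        print(name, likelihood.name.replace("_", " ").capitalize())
    print()
```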

Feature analysis

Amazon

Person 90.8%
Bicycle 84.6%
Wheel 77.8%

Captions

Microsoft

a person riding a horse drawn carriage in front of a window 87.9%
a person riding a bicycle next to a window 68.6%
a person riding a horse next to a window 68.5%
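
Ranked caption candidates like these can be produced by Azure Computer Vision's describe operation, which returns several captions with confidences. A minimal sketch, again with endpoint, key, and file path as placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("AZURE_CV_KEY"),  # placeholder key
)

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```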

Text analysis

Amazon

PAUL
PAUL LINWOOD GITINGS
LINWOOD
GITINGS
THE
THE PONYCART
PONYCART
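
The Amazon text results mix whole lines ("THE PONYCART") with the individual words inside them, which matches AWS Rekognition's DetectText behavior. A minimal sketch with boto3 (file path is a placeholder):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections, which is why whole
# phrases and their individual words both appear in the list above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```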

Google

-
PAUL
LINWOOD
GITINGS
THE
PONYCART
THE PONYCART - PAUL LINWOOD GITINGS
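
The Google text results can be reproduced with the Vision API's text detection, whose first annotation is the full recovered string followed by the individual words. A minimal sketch (file path is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("ponycart.jpg", "rb") as f:  # hypothetical path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full recovered string; the rest are the
# individual words that make it up.
for annotation in response.text_annotations:
    print(annotation.description)
```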