Human Generated Data

Title

Untitled (people in old-fashioned clothes on street)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16065

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Wheel 99.8
Machine 99.8
Person 99.8
Human 99.8
Person 99.4
Person 99.2
Person 97.5
Wheel 97.3
Person 97.1
Person 96.7
Person 93.2
Transportation 92.4
Person 92.2
Vehicle 92
Spoke 84.6
Person 80.7
Overcoat 62
Clothing 62
Coat 62
Apparel 62
Bike 58.6
Carriage 57.7
Tire 57
Wheel 56.5
Wagon 55.5
Bicycle 54.4
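
The tag/score pairs above are object and scene labels with confidence percentages. A minimal sketch of how such labels could be produced with Amazon Rekognition's DetectLabels API via boto3 follows; the local filename and region are placeholders, not the museum's actual storage details.

```python
# Minimal sketch: retrieve object/scene labels with confidence scores
# from Amazon Rekognition, similar to the tag list above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.16065.jpg", "rb") as f:  # hypothetical local filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,  # roughly matches the ~50%+ tags listed above
    )

# Each label carries a name and a confidence score (0-100),
# e.g. "Wheel 99.8", "Person 99.8", "Carriage 57.7".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```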

Clarifai
created on 2023-10-29

people 100
carriage 99.9
vehicle 99.4
group 99.1
wagon 99.1
cavalry 98.8
group together 98.8
transportation system 98.6
many 97.9
street 97.6
man 95.3
adult 94.6
stagecoach 94
driver 93.8
two 92.2
cart 90.9
monochrome 90.7
several 90.1
three 87.8
town 85.7

Imagga
created on 2022-03-25

carriage 100
wheelchair 38.6
transportation 34.1
chair 28.3
vehicle 27.7
wheel 27.4
bicycle 26.9
old 25.8
transport 25.6
street 24.9
wheeled vehicle 23.5
bike 22.5
tricycle 20.7
cart 19.7
seat 18.8
ride 17.5
man 15.5
road 15.4
city 15
travel 14.8
disabled 13.8
horse cart 13.2
care 13.2
support 12.8
health 12.5
equipment 12.2
wagon 11.8
cycle 11.7
horse 11.4
disability 10.9
active 10.8
urban 10.5
illness 10.5
outdoors 10.4
outside 10.3
help 10.2
architecture 10.2
handicapped 9.9
riding 9.8
hospital 9.4
house 9.2
outdoor 9.2
historic 9.2
vintage 9.1
activity 9
people 8.9
invalid 8.9
mobility 8.8
building 8.7
sick 8.7
antique 8.7
male 8.5
conveyance 8.4
sky 8.3
park 8.2
aged 8.1
medical 7.9
facility 7.9
impairment 7.9
pedal 7.9
wheels 7.8
aid 7.8
wall 7.7
drive 7.6
historical 7.5
window 7.5
tourism 7.4
sport 7.4
tourist 7.3
metal 7.2
color 7.2
history 7.2
summer 7.1
rural 7
country 7
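
A hedged sketch of how tags like those above could be requested from Imagga's v2 tagging REST endpoint using the requests library; the credentials and image URL are placeholders.

```python
# Sketch: query Imagga's /v2/tags endpoint with HTTP Basic Auth.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"
image_url = "https://example.org/4.2002.16065.jpg"  # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Tags come back with confidence scores, e.g. "carriage 100", "wheelchair 38.6".
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```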

Google
created on 2022-03-25

Microsoft
created on 2022-03-25

horse 98.8
text 98.8
drawn 96.3
outdoor 96.2
carriage 93.1
land vehicle 84.8
vehicle 84.6
street 82.4
black and white 79.9
white 77.1
old 74.3
pulling 73.2
people 64.8
wheel 63.2
person 52.7
cart 32.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 77.7%
Angry 10.7%
Confused 3.9%
Surprised 2.3%
Happy 2.1%
Sad 1.7%
Fear 1.2%
Disgusted 0.5%

AWS Rekognition

Age 23-33
Gender Male, 99.4%
Calm 69.4%
Disgusted 20%
Confused 4.4%
Happy 2.6%
Sad 1%
Fear 0.9%
Angry 0.8%
Surprised 0.8%

AWS Rekognition

Age 16-22
Gender Female, 55.7%
Calm 53.8%
Surprised 16.8%
Sad 14.1%
Happy 4.7%
Disgusted 3.4%
Fear 2.6%
Angry 2.5%
Confused 2%
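
A minimal sketch, assuming the age, gender, and emotion readings above come from Amazon Rekognition's DetectFaces API with all attributes requested; the local filename is a placeholder.

```python
# Sketch: request full face attributes (age range, gender, emotions)
# for each detected face, as in the three analyses above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.16065.jpg", "rb") as f:  # hypothetical local filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 22, "High": 30}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 99.9}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned with per-emotion confidences, e.g. Calm 77.7%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```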

Feature analysis

Amazon

Wheel
Person
Bicycle
Wheel 99.8%
Wheel 97.3%
Wheel 56.5%
Person 99.8%
Person 99.4%
Person 99.2%
Person 97.5%
Person 97.1%
Person 96.7%
Person 93.2%
Person 92.2%
Person 80.7%
Bicycle 54.4%

Categories

Text analysis

Amazon

DENNIS
Studebaker
DENNIS MOTU
MOTU
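
A minimal sketch, assuming the detected strings above come from Amazon Rekognition's DetectText API; the local filename is a placeholder.

```python
# Sketch: detect text in the photograph. Results are returned at both
# LINE and WORD granularity, which is why "DENNIS MOTU" appears
# alongside the separate words "DENNIS" and "MOTU".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.16065.jpg", "rb") as f:  # hypothetical local filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```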

Google

LDERE DENNIS MOT Cudebaker
LDERE
DENNIS
MOT
Cudebaker