Human Generated Data

Title

Untitled (Yogyakarta, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5054

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Machine 98.7
Wheel 98.7
Wheel 98.7
Person 98.1
Wheel 98
Wheel 97.9
Bicycle 97.6
Transportation 97.6
Vehicle 97.6
Wheel 97.4
Wheel 97.3
Person 96.6
Person 96.6
Wheel 95.7
Wheel 95.6
Wheel 94.5
Person 93.6
Bicycle 92.7
Person 92.4
Person 91.8
Person 91.7
Person 91.4
Wheel 91.1
Person 90.6
Wheel 89.5
Person 88.5
Tricycle 86.9
Cycling 82.4
Sport 82.4
Person 81.7
Face 78.7
Head 78.7
Motorcycle 78.3
Person 78
Wheel 77.9
Helmet 77.2
Person 73.4
Person 72.9
Adult 72.9
Female 72.9
Woman 72.9
Wheel 72
Spoke 71.4
Person 61.2
Bicycle 60.9
Road 57.7
City 55.5
Street 55.5
Urban 55.5

Clarifai
created on 2018-05-10

vehicle 100
transportation system 99.9
bike 99.8
biker 99.8
people 99.8
street 99.7
group together 99.7
wheel 99.5
road 99
motorcyclist 99
cyclist 98.2
tricycle 98.2
seated 98.1
driver 97.9
group 97.8
many 97.5
roll along 97.5
man 97
traffic 96.6
motorbike 95.8

Imagga
created on 2023-10-07

jinrikisha 100
cart 100
wagon 100
wheeled vehicle 76.4
vehicle 49.4
wheelchair 37
transportation 35
wheel 32.1
bicycle 29.3
bike 29.3
outdoors 25.4
disabled 23.7
street 23
old 22.3
chair 21
transport 21
man 20.2
ride 17.5
care 17.3
carriage 16.8
road 16.3
outside 16.3
people 16.2
handicapped 15.8
help 14.9
park 14.8
disability 14.8
city 14.1
senior 14.1
health 13.9
cycle 13.7
active 13.5
mobility 12.7
horse 12.3
outdoor 12.2
support 12.2
male 12.1
men 12
travel 12
handicap 11.9
riding 11.7
illness 11.5
impairment 10.9
person 10.6
seat 10.6
urban 10.5
traffic 10.5
lifestyle 10.1
invalid 9.9
sport 9.9
activity 9.9
wheels 9.8
driver 9.7
adult 9.7
retired 9.7
physical 9.7
sick 9.7
summer 9.6
drive 9.5
equipment 9
disable 8.9
cycling 8.9
medical 8.8
elderly 8.6
husband 8.6
wife 8.5
hospital 8.5
mature 8.4
aged 8.1
recreation 8.1
invalidness 7.9
disablement 7.9
invalidity 7.9
facility 7.9
recovery 7.9
aid 7.8
driving 7.7
retirement 7.7
town 7.4
sports 7.4
training 7.4
vacation 7.4
speed 7.3
family 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.9
people 69.3
transport 66.8
old 41.6
carriage 39.6
several 10.1

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 26-36
Gender Male, 100%
Disgusted 92.3%
Surprised 6.4%
Fear 6%
Confused 2.8%
Sad 2.7%
Calm 1.4%
Angry 1%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 99.5%
Calm 86.2%
Surprised 12.2%
Fear 6%
Sad 2.6%
Angry 2%
Happy 0.7%
Disgusted 0.6%
Confused 0.3%

AWS Rekognition

Age 23-33
Gender Male, 61.7%
Angry 67.4%
Calm 13.8%
Surprised 7.3%
Confused 6.9%
Fear 6.2%
Sad 6.1%
Disgusted 1.3%
Happy 0.2%

AWS Rekognition

Age 25-35
Gender Female, 88.8%
Calm 97.5%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Angry 0.8%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 54.9%
Surprised 12.6%
Confused 8.8%
Sad 8.8%
Fear 6.5%
Angry 6%
Happy 5.3%
Disgusted 4.1%

Microsoft Cognitive Services

Age 11
Gender Male

Feature analysis

Amazon

Wheel 98.7%
Person 98.1%
Bicycle 97.6%
Motorcycle 78.3%
Helmet 77.2%
Adult 72.9%
Female 72.9%
Woman 72.9%

Text analysis

Amazon

PA
115