Human Generated Data

Title

Untitled (two boys riding in cart)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17302

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.8
Human 97.8
Person 97.1
Vehicle 90
Transportation 90
Apparel 82.1
Clothing 82.1
Carriage 81.7
Machine 80.8
Wheel 80.8
Motorcycle 67.8
Wagon 62.2
Tricycle 62.1
Beach Wagon 58.1

Imagga
created on 2022-02-26

conveyance 100
sidecar 100
vehicle 29
car 23.6
man 22.8
outdoors 22.4
people 21.2
drive 19.8
male 18.4
driver 17.5
sport 17.3
road 17.2
transportation 17
speed 16.5
outside 16.2
wheeled vehicle 15.6
adult 15.5
outdoor 15.3
lifestyle 15.2
wheel 14.1
wheelchair 13.7
race 13.4
tricycle 13.2
person 13
fun 12.7
bike 12.7
ride 12.6
motor 12.6
leisure 12.4
boy 12.2
fast 12.2
men 12
transport 11.9
seat 11.5
auto 11.5
old 11.1
lawn mower 11.1
motorcycle 11
child 10.9
automobile 10.5
sitting 10.3
motion 10.3
day 10.2
street 10.1
active 9.9
motorbike 9.9
attractive 9.8
riding 9.8
vintage 9.1
chair 8.8
wheels 8.8
happy 8.8
driving 8.7
grass 8.7
smiling 8.7
track 8.6
smile 8.5
travel 8.4
sports 8.3
sky 8.3
garden tool 8.1
recreation 8.1
cart 7.9
racing 7.8
play 7.8
helmet 7.7
extreme 7.7
machine 7.6
power 7.6
blur 7.4
landscape 7.4
action 7.4
park 7.4
tool 7.4
competition 7.3
color 7.2
women 7.1
work 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.8
grass 96.9
black and white 95.1
wheel 94.4
person 83.5
tire 82.6
monochrome 72.8
transport 69.9
cart 68.1
black 67.1
text 62.6
white 60.9
man 55.5
handcart 48.4
old 40.6

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 82.4%
Calm 90.8%
Sad 7.5%
Happy 0.9%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Male, 99.6%
Calm 98.9%
Surprised 0.8%
Angry 0.1%
Happy 0.1%
Sad 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.8%
Wheel 80.8%
Motorcycle 67.8%

Captions

Microsoft

a person riding a bike down a dirt road 66.8%
a person riding a bike down a dirt road 66.7%
a person riding a bike down a dirt road in front of a building 53.4%

Text analysis

Amazon

603
MJI7
MJI7 YT37AS ООЗИА
YT37AS
ООЗИА
10