Human Generated Data

Title

Untitled (boy and girl riding tricycles inside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17379

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Vehicle 99.2
Transportation 99.2
Bicycle 99.2
Bike 99.2
Wheel 99
Machine 99
Wheel 98
Wheel 96.9
Person 96.5
Shorts 86.4
Clothing 86.4
Apparel 86.4
Furniture 74.3
Chair 68.1
Cyclist 65
Sport 65
Sports 65
Wheel 64.9
Chair 64.3
Face 61.4
Photo 61.4
Portrait 61.4
Photography 61.4
Floor 60.9
Spoke 57
Tricycle 55.3
Wheel 51
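
The Amazon tags above pair each label with a confidence score on a 0-100 scale, the shape of output returned by AWS Rekognition's DetectLabels API. The record does not document the exact pipeline used to generate it, but a minimal boto3 sketch under assumed inputs (a hypothetical local photo.jpg and a MinConfidence cutoff of 50) would produce a comparable list:

```python
import boto3

# Assumption: AWS credentials and region are configured in the environment.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed cutoff; the lowest tag above is ~51
    )

# Print "Label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```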

Imagga
created on 2022-02-26

wheeled vehicle 29.9
shopping cart 27.8
handcart 22.3
basket 22.2
chair 21.3
container 20.8
people 18.9
wheelchair 18.4
person 17.9
man 17.5
shopping 16.5
blackboard 14.7
store 14.2
buy 14.1
cart 13.6
sax 13.4
tricycle 13.3
market 13.3
seat 13.1
conveyance 13
shopping basket 12.9
shop 12.8
business 12.7
supermarket 12
men 12
vehicle 12
device 11.8
lifestyle 11.6
male 11.3
sport 11.3
outdoors 11.2
buying 10.6
urban 10.5
metal 10.5
empty 10.3
work 10.2
transport 10
exercise 10
ball 9.8
fun 9.7
planner 9.7
black 9.6
body 9.6
retail 9.5
women 9.5
3d 9.3
sale 9.2
city 9.1
hand 9.1
technology 8.9
purchase 8.7
furniture 8.5
exercise bike 8.4
holding 8.2
human 8.2
fitness 8.1
transportation 8.1
structure 8
steel 7.9
working 7.9
trolley 7.9
mall 7.8
adult 7.8
athlete 7.7
health 7.6
building 7.6
push 7.6
wheel 7.5
active 7.4
competition 7.3
interior 7.1
day 7.1
modern 7
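
Imagga's tags follow the same label-plus-confidence pattern and can be produced through its public v2 tagging endpoint. A sketch with placeholder credentials and a hypothetical hosted image URL:

```python
import requests

# Placeholder credentials and URL; substitute real values.
IMAGGA_KEY, IMAGGA_SECRET = "<api-key>", "<api-secret>"
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical hosted copy

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # Imagga uses HTTP basic auth
)
response.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```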

Google
created on 2022-02-26
(no tags returned)

Microsoft
created on 2022-02-26

text 94.3
outdoor 91
person 79.3
black and white 77
wheel 75.7
bicycle 74.7
clothing 64.7
cart 63.2
street 57.7

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 99.5%
Surprised 99%
Calm 0.4%
Fear 0.3%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Sad 0%
Confused 0%

AWS Rekognition

Age 30-40
Gender Female, 71.7%
Calm 89.4%
Happy 3.6%
Fear 2.4%
Surprised 1.7%
Sad 1%
Disgusted 0.9%
Confused 0.6%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
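
Each AWS Rekognition block above (an age range, a gender call with confidence, and an emotion distribution summing to roughly 100%) mirrors the FaceDetails structure returned by the DetectFaces API. A minimal sketch, reusing the rekognition client and hypothetical photo.jpg assumed in the earlier DetectLabels example:

```python
# Assumes the `rekognition` client and photo.jpg from the DetectLabels sketch.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; list them strongest first, as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```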

Feature analysis

Amazon

Person 99.6%
Bicycle 99.2%
Wheel 99%
Chair 68.1%

Captions

Microsoft

a person riding on the back of a bicycle 48.4%
a person riding on the back of a bicycle 48.3%
a group of people riding on the back of a bicycle 45.8%
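
Ranked candidate captions like these are characteristic of Azure Computer Vision's Describe operation, which returns several captions with confidences on a 0-1 scale. A sketch with a placeholder endpoint and key, using the azure-cognitiveservices-vision-computervision package:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute real Azure resource values.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    analysis = client.describe_image_in_stream(f, max_candidates=3)

# Scale 0-1 confidences to match the percentages shown above.
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```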

Text analysis

Amazon

134-394
MINNESOTA
SJ
CLAPP S
KODOK-SEELA

Google

134-394
134-394
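
The detected strings above, including apparent film-edge OCR misreads such as "KODOK-SEELA", match the DetectedText fields of Rekognition's DetectText response; the duplicated Google entry is consistent with Vision text annotations, where the first item is the full detected text and later items are individual blocks. A final sketch under the same assumptions as the earlier Rekognition examples:

```python
# Assumes the `rekognition` client and photo.jpg from the earlier sketches.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections correspond to the entries listed above;
# WORD detections repeat the same text token by token.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```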