Human Generated Data

Title

Untitled (girl pulling children in wagon)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17669

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 98.4
Person 98.1
Clothing 95
Apparel 95
Dress 90.6
Wheel 89.9
Machine 89.9
Vehicle 86
Transportation 86
Mammal 83.1
Pet 83.1
Dog 83.1
Animal 83.1
Canine 83.1
Female 72.2
Girl 63.1
Bike 62.7
Bicycle 62.7
Portrait 62.2
Photography 62.2
Photo 62.2
Face 62.2
Wagon 62.1
People 57.7
Carriage 56.7
Child 55.1
Kid 55.1
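
The label-and-confidence pairs above match the output format of Amazon Rekognition's DetectLabels API (scores on a 0-100 scale). Below is a minimal sketch of how such tags could be produced with boto3; the local file name and the MinConfidence floor (set near the lowest score listed above) are assumptions, not the documented pipeline, and AWS credentials are taken from the environment.

    import boto3

    # Assumed local file name; the actual image source is not documented here.
    with open("4.2002.17669.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # DetectLabels returns label names with 0-100 confidence scores.
    # MinConfidence=55 roughly matches the lowest score in the list above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')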

Imagga
created on 2022-02-26

wheeled vehicle 77.3
tricycle 67.9
vehicle 53.9
barrow 36.9
conveyance 35.3
handcart 29.6
man 21.5
outdoors 19.9
people 19.5
child 18.4
outdoor 18.4
beach 17.8
sunset 16.2
male 15.1
park 14.8
active 14.4
summer 14.2
lawn mower 13.5
water 13.3
sea 13.3
person 13.2
vacation 13.1
sand 12.3
autumn 12.3
adult 12.3
outside 12
fun 12
tool 11.9
sport 11.9
love 11
garden tool 10.8
silhouette 10.8
recreation 10.8
mountain 10.7
sky 10.2
ocean 10
sun 9.7
couple 9.6
men 9.4
lifestyle 9.4
shore 9.3
leisure 9.1
old 9.1
activity 9
landscape 8.9
kid 8.9
forest 8.7
boy 8.7
day 8.6
holiday 8.6
two 8.5
travel 8.5
relax 8.4
danger 8.2
family 8
rural 7.9
women 7.9
country 7.9
cold 7.7
hiking 7.7
tree 7.7
dusk 7.6
walk 7.6
walking 7.6
field 7.5
happy 7.5
holidays 7.5
life 7.3
countryside 7.3
girls 7.3
dirty 7.2
sexy 7.2
coast 7.2
farm 7.1
sunlight 7.1
grass 7.1
work 7.1
little 7.1
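
Imagga exposes auto-tagging through a REST endpoint rather than a cloud SDK. A sketch using the requests library against Imagga's v2 tags endpoint follows; the API key, secret, and image URL are placeholders, and the response shape is assumed from Imagga's v2 documentation.

    import requests

    # Placeholder credentials and image URL; real values come from an Imagga account.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/4.2002.17669.jpg"

    # The /v2/tags endpoint uses HTTP basic auth and returns
    # tags with confidence scores on a 0-100 scale.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    for tag in response.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')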

Google
created on 2022-02-26

Wheel 92.6
Black 89.6
Adaptation 79.3
Plant 76.1
Monochrome photography 70.4
Toddler 69.7
Motor vehicle 66.6
Vintage clothing 66.5
Working animal 65.5
Monochrome 62.5
Water 59.8
Baby 57.1
Tire 55.9
Baby Products 55.5
History 52.7
Child 50.8
Room 50.7
Sitting 50.6
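
The Google tags correspond to Cloud Vision label detection. A sketch with the google-cloud-vision client library; the file name is an assumption, and the API's native 0-1 scores are rescaled to the 0-100 convention used in this record.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Assumed local file name.
    with open("4.2002.17669.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # label_detection returns LabelAnnotation objects with a 0-1 score.
    response = client.label_detection(image=image)

    for annotation in response.label_annotations:
        # Rescale to 0-100 to match the scores listed above.
        print(f"{annotation.description} {annotation.score * 100:.1f}")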

Microsoft
created on 2022-02-26

outdoor 97.1
clothing 90
text 86.5
person 86.3
footwear 70
child 63.4
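
The Microsoft tags map to the Azure Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint, key, and image URL.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
    KEY = "your_key"
    IMAGE_URL = "https://example.org/4.2002.17669.jpg"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # tag_image returns tags with 0-1 confidence scores.
    result = client.tag_image(IMAGE_URL)
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")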

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 94%
Calm 96.4%
Happy 1.3%
Sad 1.2%
Surprised 0.3%
Disgusted 0.3%
Confused 0.3%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 99.8%
Calm 83.8%
Surprised 8.1%
Sad 6.2%
Fear 0.8%
Happy 0.7%
Disgusted 0.3%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Female, 87.2%
Calm 98.6%
Happy 0.5%
Sad 0.4%
Surprised 0.3%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%
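
The three blocks above are per-face results of the kind returned by Rekognition's DetectFaces API when full attributes are requested. A sketch of retrieving the age range, gender, and emotion scores with boto3; the file name is an assumption.

    import boto3

    rekognition = boto3.client("rekognition")

    # Assumed local file name.
    with open("4.2002.17669.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')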

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
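
The Google Vision face blocks report likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with the same google-cloud-vision client, assuming version 2.x, where the likelihood fields are enum values.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Assumed local file name.
    with open("4.2002.17669.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # face_detection returns per-face likelihood enums, not percentages.
    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # .name assumes google-cloud-vision 2.x enum fields.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)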

Feature analysis

Amazon

Person 99.5%
Wheel 89.9%
Dog 83.1%
Bicycle 62.7%

Captions

Microsoft

a group of people standing in front of a building 77.1%
an old photo of a person 77%
a group of people in a field 76.9%
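
The ranked captions come from Azure Computer Vision's describe operation, which proposes caption candidates with confidence scores. A sketch, again with placeholder endpoint, key, and image URL.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint, key, and image URL.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
    KEY = "your_key"
    IMAGE_URL = "https://example.org/4.2002.17669.jpg"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # describe_image returns ranked caption candidates with 0-1 confidence.
    analysis = client.describe_image(IMAGE_URL, max_candidates=3)
    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")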

Text analysis

Amazon

YT33AS
MJIR YT33AS
MJIR
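
The strings above (likely film-edge markings) match the output of Rekognition's DetectText API, which returns both whole lines and their component words, which is why "MJIR YT33AS" appears alongside "MJIR" and "YT33AS" separately. A sketch, with an assumed file name:

    import boto3

    rekognition = boto3.client("rekognition")

    # Assumed local file name.
    with open("4.2002.17669.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections, so a line such as
    # "MJIR YT33AS" appears once whole and once per word.
    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])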

Google

MJIR
YT3RA2
002M,
MJIR YT3RA2 002M,
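
Google's OCR results come from Cloud Vision text detection, where the first annotation is the full recovered text and the rest are individual tokens, matching the mix of fragments and the combined line above. A sketch, with an assumed file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Assumed local file name.
    with open("4.2002.17669.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # text_detection returns the full text first, then per-token annotations.
    response = client.text_detection(image=image)

    for annotation in response.text_annotations:
        print(annotation.description)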