Human Generated Data

Title

Untitled (two boys playing in wheelbarrow)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17552

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.7
Person 99.7
Person 97.2
Vehicle 91
Transportation 91
Face 90.8
Clothing 85.6
Apparel 85.6
Bench 85.6
Furniture 85.6
Outdoors 85.3
Plant 80.6
Wheel 79.4
Machine 79.4
Chair 75.6
Nature 73.4
Portrait 68.7
Photo 68.7
Photography 68.7
Tree 63.3
Kid 59.4
Child 59.4
Grass 59.3
Vegetation 59.1
Wheelbarrow 58
Barrow 58
Yard 56.8
Shorts 55.5
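The numbers beside each label are percent confidences from the classifier. A minimal sketch of filtering such a label list by a confidence threshold (label names and scores are a small subset copied from the Rekognition output above; the 80% cutoff is an arbitrary assumption for illustration):

```python
# Filter machine-generated labels by confidence (percent).
# Data: a hypothetical subset of the AWS Rekognition labels above.
labels = [
    ("Person", 99.7),
    ("Vehicle", 91.0),
    ("Wheelbarrow", 58.0),
    ("Shorts", 55.5),
]

def confident_labels(labels, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(labels))  # → ['Person', 'Vehicle']
```

Low-confidence guesses such as "Wheelbarrow" (58) drop out at this threshold, which is why catalog displays often surface only a service's highest-scoring tags.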

Imagga
created on 2022-02-26

barrow 100
handcart 100
wheeled vehicle 98.6
vehicle 64.2
conveyance 33.8
outdoors 26.2
man 26.2
people 25.1
happy 23.2
bench 22.6
male 21.3
person 21
child 20.7
park 20.6
outdoor 19.9
adult 18.8
couple 17.4
sitting 16.3
laptop 15.5
lifestyle 15.2
mother 14.8
computer 14.4
family 14.2
grass 14.2
day 14.1
outside 13.7
happiness 13.3
wheel 13.2
together 13.1
smiling 13
smile 12.8
wheelchair 12.6
autumn 12.3
old 11.8
work 11.8
seat 11.6
boy 11.3
technology 11.1
love 11
summer 10.9
kid 10.6
chair 10.5
garden 10.1
cute 10
cart 10
leisure 10
active 9.9
care 9.9
attractive 9.8
senior 9.4
tree 9.2
relaxation 9.2
relaxing 9.1
portrait 9.1
wheelbarrow 8.9
working 8.8
park bench 8.8
father 8.8
elderly 8.6
sit 8.5
togetherness 8.5
joy 8.4
fun 8.2
fall 8.2
cheerful 8.1
scholar 8
water 8
women 7.9
disabled 7.9
full length 7.8
retired 7.8
talking 7.6
gardening 7.6
relax 7.6
path 7.6
communication 7.6
field 7.5
relaxed 7.5
wood 7.5
city 7.5
parent 7.5
landscape 7.4
business 7.3
children 7.3
student 7.2
looking 7.2
childhood 7.2
job 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.8
grass 99.1
black and white 92.7
text 91.8
person 89.9
clothing 88.3
handcart 71.3
park 60.5
man 58.1

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 93.7%
Calm 52.6%
Happy 34.1%
Surprised 6.5%
Fear 2.5%
Sad 1.4%
Disgusted 1.2%
Angry 1%
Confused 0.7%
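The emotion scores above form a confidence distribution over predicted states; the service's top guess is simply the highest-scoring entry. A minimal sketch of selecting the dominant emotion from such output (values copied from the Rekognition result above):

```python
# Pick the highest-scoring emotion from a Rekognition-style result.
# Scores (percent) copied from the face analysis above.
emotions = {
    "Calm": 52.6, "Happy": 34.1, "Surprised": 6.5, "Fear": 2.5,
    "Sad": 1.4, "Disgusted": 1.2, "Angry": 1.0, "Confused": 0.7,
}

dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```

Note that the top score here is only 52.6%, so "Calm" is a plurality choice over a fairly spread distribution rather than a confident call.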

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Bench 85.6%
Wheel 79.4%

Captions

Microsoft

a person sitting on a bench 90%
a person sitting on a bench 89.9%
a person sitting on a park bench 80.4%

Text analysis

Amazon

I3S
YIGNAC