Human Generated Data

Title

Untitled (Man holding baby in arms in street)

Date

1950s

People

Artist: Leon Levinstein, American, 1910-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of John Erdman and Gary Schneider from the Helen Gee Collection, 2016.403

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2023-07-06

City 100
Road 100
Street 100
Urban 100
Face 100
Head 100
Photography 100
Portrait 100
Alley 100
Neighborhood 98.7
Person 98.6
Baby 98.6
Person 98
People 97
Path 95.6
Person 94.1
Person 92.6
Clothing 77.9
Footwear 77.9
Shoe 77.9
Person 75.6
Accessories 63.7
Glasses 63.7
Beard 57.8
Sunglasses 57.5
Sidewalk 57.4
Metropolis 57.3
T-Shirt 56.7
Architecture 56.3
Building 56.3
Condo 56.3
Housing 56.3
Office Building 56
Coat 56
Body Part 55.5
Finger 55.5
Hand 55.5
High Rise 55.5
Jacket 55.4
Selfie 55.2

Clarifai
created on 2023-10-13

people 100
street 99.8
man 99
two 98.8
adult 98.8
child 95.4
monochrome 94.9
group 94.8
portrait 93.2
woman 93.1
three 92.8
interaction 92.2
group together 90.3
administration 89.9
music 88.2
one 87.8
elderly 85.4
family 82.3
road 82.2
several 81.8

Imagga
created on 2023-07-06

man 39
male 33.5
person 25.6
people 24.5
men 21.5
adult 20.1
black 18.1
device 17.7
human 15.7
weapon 14.8
equipment 14.5
portrait 14.2
work 13.3
hand 12.9
photographer 12.8
gun 12.8
guy 11.7
safety 11
occupation 11
helmet 10.9
worker 10.7
danger 10
one 9.7
sport 9.6
tool 9.6
head 9.2
power 9.2
entertainment 9.2
close 9.1
job 8.8
body 8.8
cigarette 8.8
military 8.7
hands 8.7
lifestyle 8.7
professional 8.6
model 8.6
business 8.5
two 8.5
glasses 8.3
training 8.3
protection 8.2
machinist 7.9
couple 7.8
criminal 7.8
face 7.8
crime 7.8
player 7.8
repair 7.7
outdoors 7.6
mask 7.5
musical instrument 7.5
pistol 7.5
leisure 7.5
active 7.5
smoke 7.4
music 7.3
sexy 7.2
looking 7.2
father 7.1
hair 7.1
women 7.1

Google
created on 2023-07-06

Building 92.5
White 92.2
Black 89.6
Window 86.5
Black-and-white 86.2
Gesture 85.3
Style 84.1
Monochrome 75.2
Monochrome photography 74.7
Happy 74.4
Toddler 73.1
Road 72.3
Smile 72.2
City 70.1
Art 69.9
Event 67.6
Baby 66.2
Pedestrian 65
Street 64
Fun 62.8

Microsoft
created on 2023-07-06

outdoor 99
person 98.7
black and white 91.5
text 79
clothing 76.8
man 73.3
city 51.2
crowd 0.6

Face analysis

AWS Rekognition

Age 0-6
Gender Female, 51.6%
Calm 98.1%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 0.9%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 27-37
Gender Female, 96.2%
Calm 97.4%
Surprised 6.3%
Fear 5.9%
Sad 3%
Confused 0.1%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 16-24
Gender Male, 77.3%
Calm 97.3%
Surprised 6.3%
Fear 5.9%
Sad 2.9%
Confused 0.3%
Angry 0%
Happy 0%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Baby 98.6%
Shoe 77.9%

Text analysis

Google

100 KIRT
100
KIRT