Human Generated Data

Title

Untitled (children drag-racing in tiny cars)

Date

1960

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16041

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Person 99.2
Person 99.2
Clothing 98.8
Apparel 98.8
Person 98.7
Helmet 97.9
Person 97.6
Automobile 93.9
Vehicle 93.9
Transportation 93.9
Person 81.7
Helmet 76.1
Car 74.3
Formula One 71.8
Person 69.3
Crash Helmet 67.6
Hardhat 66.3
Race Car 65.5
Sports Car 65.5
Person 65
Person 63.4
Person 62.9
Bobsled 59.3
Sled 59.3
Watercraft 56.9
Vessel 56.9
Boat 55.3
Person 55
Person 49.2

Clarifai
created on 2023-10-29

people 98.9
woman 94.4
vehicle 94
adult 93.7
competition 93.1
man 92.7
group together 92.4
street 91.4
group 90.8
action 89.1
race 88.3
sport 88.2
transportation system 87.4
race (competition) 85.7
many 84.9
car 84.4
track 82.8
hurry 81.9
fast 80
championship 79.2

Imagga
created on 2022-02-05

bobsled 83.9
sled 67
vehicle 52.5
conveyance 28.1
people 25.6
man 24.2
car 17.6
fun 16.5
male 16.3
person 15.5
spectator 15.4
lifestyle 14.4
adult 13.6
happy 13.1
leisure 12.4
smile 12.1
racer 11.9
women 11.9
smiling 11.6
sitting 11.2
group 10.5
attractive 9.8
newspaper 9.6
love 9.5
outdoors 8.9
sexy 8.8
indoors 8.8
couple 8.7
youth 8.5
business 8.5
friends 8.4
black 8.4
portrait 8.4
summer 8.4
ocean 8.3
fashion 8.3
human 8.2
alone 8.2
indoor 8.2
motor vehicle 8.1
boat 7.9
sport 7.7
life 7.7
one 7.5
water 7.3
dress 7.2
home 7.2
team 7.2
holiday 7.2
work 7.1
sea 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 99.8
text 98.9
clothing 95.9
car 82.1
vehicle 81.8
woman 73.7
man 71.6
land vehicle 61.5

Color Analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 99.5%
Angry 99.4%
Disgusted 0.2%
Confused 0.1%
Sad 0.1%
Calm 0.1%
Surprised 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 4-10
Gender Female, 100%
Calm 87.9%
Confused 8.1%
Surprised 2.9%
Happy 0.5%
Angry 0.2%
Disgusted 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.3%
Sad 89.8%
Calm 8.6%
Fear 1.2%
Happy 0.2%
Angry 0%
Confused 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 23-31
Gender Female, 86.8%
Sad 96.9%
Calm 1.5%
Happy 1%
Fear 0.2%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person
Helmet
Car
Person 99.8%
Person 99.2%
Person 99.2%
Person 98.7%
Person 97.6%
Person 81.7%
Person 69.3%
Person 65%
Person 63.4%
Person 62.9%
Person 55%
Person 49.2%
Helmet 97.9%
Helmet 76.1%
Car 74.3%

Categories

Imagga

people portraits 49.6%
food drinks 36.8%
events parties 12.1%

Text analysis

Amazon

MIDGET
1/4 MIDGET SALES
7
1/4
SALES
JA.1-8367
YT37
AK
V MINGET SALES
MJI7 YT37
MJI7
YOD AK S.A الخلكا
a
YOD
S.A
V MINGET
الخلكا

Google

¼ MIDGET SALES JA.1-8367 MJI YT 3 Min GET SALES
¼
MIDGET
SALES
JA.1-8367
MJI
YT
3
Min
GET