Human Generated Data

Title

Untitled (drivers drag-racing in tiny cars, woman sitting in car)

Date

1960

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16036

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.6
Human 99.6
Person 99.4
Person 98.7
Person 98.6
Person 98.4
Person 97.9
Chair 91.4
Furniture 91.4
Wheel 85.2
Machine 85.2
Person 78
Wheel 76.1
Vehicle 72.3
Transportation 72.3
Wheel 72.2
Wheel 71.2
People 66
Spoke 65.1
Alloy Wheel 58.1
Kart 57.8
Tire 57.5
Screen 56.1
Electronics 56.1
Person 51.3

Clarifai
created on 2023-10-29

people 99.6
child 98.4
street 98.1
vehicle 97.2
transportation system 96.4
cart 95.6
wheel 95.4
city 94.6
group 94.6
carriage 94.2
woman 94.1
roll along 94
girl 93.4
man 93.4
fun 92.2
recreation 89
adult 88.8
group together 88.8
travel 88.7
toy 88.5

Imagga
created on 2022-02-05

tricycle 56.1
wheeled vehicle 50.7
vehicle 40.1
man 25.5
wheelchair 25.5
conveyance 24.9
people 20.1
male 17.8
chair 17.6
person 17.1
adult 16.4
equipment 15.3
backboard 15.3
outdoors 14.2
wheel 13.2
old 12.5
sport 12.3
happy 11.3
seat 11.1
portrait 11
day 10.2
bicycle 9.9
active 9.9
care 9.9
disabled 9.9
bike 9.8
human 9.7
lady 9.7
lifestyle 9.4
brass 9
medical 8.8
couple 8.7
elderly 8.6
travel 8.4
health 8.3
sunset 8.1
smiling 8
park 7.8
model 7.8
sick 7.7
summer 7.7
attractive 7.7
sky 7.6
senior 7.5
fun 7.5
building 7.3
aged 7.2
recreation 7.2
wind instrument 7.2
love 7.1
job 7.1
working 7.1

Google
created on 2022-02-05

Tire 97.3
Wheel 96.5
Motor vehicle 91.5
Product 90.8
Vehicle 90.5
Mode of transport 85.9
Adaptation 79.3
Snapshot 74.3
Toddler 73.6
Riding toy 73.2
Rectangle 68.2
Rolling 67.9
Art 67.8
Street 62.4
Baby 62.3
Illustration 62
Visual arts 61.9
Baby carriage 61.9
Baby Products 60.8
Asphalt 60.7

Microsoft
created on 2022-02-05

outdoor 97.7
text 97
person 92.9
vehicle 84.7
land vehicle 79.5
wheel 77.8
clothing 77.3
cart 69

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 85.4%
Calm 63.9%
Confused 18.7%
Angry 11.5%
Disgusted 2.4%
Surprised 1.6%
Sad 0.8%
Happy 0.6%
Fear 0.4%

AWS Rekognition

Age 33-41
Gender Male, 96%
Calm 98.1%
Disgusted 0.7%
Surprised 0.4%
Happy 0.3%
Confused 0.1%
Sad 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 67.3%
Calm 98.8%
Sad 0.5%
Disgusted 0.2%
Happy 0.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Male, 87.7%
Calm 95.7%
Confused 2.1%
Happy 0.8%
Sad 0.7%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Person 99.4%
Person 98.7%
Person 98.6%
Person 98.4%
Person 97.9%
Person 78%
Person 51.3%
Wheel 85.2%
Wheel 76.1%
Wheel 72.2%
Wheel 71.2%

Categories

Imagga

paintings art 92.2%
pets animals 2.8%
food drinks 2.3%

Text analysis

Amazon

S
TH
3
ع
0
..
SC
.. m SC TH ENT NNER .00
.00
LIFW
ENT
KAGON
NNER
-08-
m

Google

KODVK YT37A2 1 - TH
KODVK
YT37A2
1
-
TH