Human Generated Data

Title

Untitled (Stewart Model et al., Africa)

Date

1915

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3180

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.6
Person 99.6
Person 99.3
Person 99
Person 97.9
Wheel 96
Machine 96
Person 95.4
Automobile 94.8
Vehicle 94.8
Transportation 94.8
Model T 92.8
Antique Car 92.8
Car 83.5
Building 68.7
Wheel 64.1
Wheel 56
Countryside 55.1
Nature 55.1
Outdoors 55.1
Person 49
Person 44.1

Clarifai
created on 2023-10-25

people 100
vehicle 99.6
adult 99.2
man 98.5
transportation system 98.4
group together 98
woman 97.5
group 97.2
child 97.2
boy 95.7
two 95.4
cropland 94
driver 92.6
three 91.5
monochrome 86.6
girl 85.9
vintage 85.7
cavalry 85.5
street 84.3
four 84.3

Imagga
created on 2022-01-08

vehicle 55.6
hay 42.1
wheeled vehicle 35.2
tricycle 32.1
tractor 31.8
farm 30.3
machine 29.4
car 27.1
grass 26.9
fodder 26.4
feed 25.6
agriculture 25.5
rural 24.7
old 23.7
field 23.4
plow 21.8
equipment 21.4
machinery 19.5
transport 19.2
transportation 18.8
truck 18.1
wheel 18.1
landscape 17.9
farming 17.1
tool 16.6
outdoor 16.1
conveyance 15.7
summer 15.4
model t 15.2
tree 14.7
industry 14.5
work 14.1
autumn 14.1
farmer 14.1
food 13.8
motor vehicle 13.5
sky 13.4
harvest 13.2
countryside 12.8
agricultural 12.7
working 12.4
land 12
industrial 11.8
abandoned 11.7
automobile 11.5
rusty 11.4
tire 11.1
road 10.8
vintage 10.8
park 10.7
antique 10.6
country 10.5
auto 10.5
heavy 10.5
outdoors 10.5
crop 10.3
wagon 10.3
outside 10.3
earth 10.1
farm machine 9.9
wheels 9.8
rust 9.6
dirt 9.6
grain 9.2
hot 9.2
fall 9.1
adult 9.1
thresher 8.7
farmland 8.7
cart 8.7
driving 8.7
yellow 8.6
people 8.4
trailer 8.3
man 8.1
tires 7.9
day 7.8
forest 7.8
track 7.8
engine 7.7
device 7.6
harvester 7.6
drive 7.6
hill 7.5
danger 7.3
metal 7.2
dirty 7.2
steamroller 7.2
meadow 7.2
trees 7.1
spring 7.1
wheelchair 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.8
grass 98.1
land vehicle 97
wheel 96.1
old 95.9
vehicle 95.2
people 77.3
tire 75.8
car 65.1
text 64.8
person 63.8
posing 54.2
tractor 35.3
family 17.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 20-28
Gender Male, 99.9%
Calm 99.9%
Angry 0%
Confused 0%
Surprised 0%
Sad 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 45-51
Gender Male, 100%
Sad 86.8%
Calm 6.8%
Confused 3%
Disgusted 1.4%
Angry 0.9%
Surprised 0.6%
Fear 0.4%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.8%
Calm 87.6%
Angry 7.2%
Sad 2.6%
Confused 1.6%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 37.2%
Angry 35.4%
Sad 9.8%
Disgusted 5.1%
Surprised 4.1%
Confused 3.9%
Fear 3.4%
Happy 1.1%

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Calm 98.4%
Disgusted 0.6%
Surprised 0.3%
Sad 0.2%
Happy 0.2%
Angry 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 99.7%
Sad 99.2%
Fear 0.3%
Angry 0.1%
Calm 0.1%
Confused 0.1%
Disgusted 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 24-34
Gender Female, 96.8%
Sad 79%
Calm 7.3%
Fear 6.1%
Confused 4.1%
Happy 1.5%
Disgusted 0.8%
Angry 0.6%
Surprised 0.5%

AWS Rekognition

Age 31-41
Gender Male, 96.7%
Calm 96.2%
Surprised 2.2%
Sad 0.9%
Fear 0.2%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Wheel 96%
Car 83.5%

Categories