Human Generated Data

Title

Untitled (boy steering toy car in room decorated for Christmas)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9157

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97.3
Human 97.3
Person 93.4
Wheel 90.7
Machine 90.7
Transportation 89.6
Vehicle 89.6
Clothing 86.7
Apparel 86.7
Car 83.1
Automobile 83.1
Face 81
Chair 80.7
Furniture 80.7
Street 79.1
Building 79.1
Town 79.1
Urban 79.1
Road 79.1
City 79.1
Shorts 75.7
Play 72
Child 71.7
Kid 71.7
Wheel 69.8
Outdoors 69.6
Path 69.3
Portrait 68.2
Photo 68.2
Photography 68.2
Tree 63.3
Plant 63.3
Female 63
Girl 63
Person 61.5
People 61.4
Nature 59.6
Pants 56.9
Buggy 56.6
Boy 56
Tarmac 55.9
Asphalt 55.9
Tire 55.7
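
The label/score pairs above are confidence percentages as returned by Amazon Rekognition. As a minimal sketch (not the project's documented pipeline), such labels could be retrieved with boto3 roughly as follows; the file name and region are placeholders, not part of this record.

    import boto3

    # Placeholder path to the digitized print; not part of this record.
    IMAGE_PATH = "untitled_boy_toy_car_1948.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        labels = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the tag list above bottoms out around 55%
        )

    # Each label pairs a name with a confidence score, as in the list above.
    for label in labels["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')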

Imagga
created on 2022-01-23

barbershop 48.9
shop 39.8
mercantile establishment 30.9
vehicle 30.9
car 29.7
motor vehicle 21.7
transportation 21.5
place of business 20.7
golf equipment 19.4
road 18.1
equipment 17.2
man 16.8
machine 15.5
people 15.1
transport 14.6
sports equipment 14.6
drive 14.2
working 14.1
work 14.1
industry 13.7
wheeled vehicle 13.7
driving 13.5
travel 13.4
wheel 13.3
male 12.8
industrial 12.7
automobile 12.4
building 11.9
old 11.8
person 11.8
auto 11.5
computer 11.3
chair 10.7
engine 10.6
establishment 10.4
street 10.1
speed 10.1
danger 10
city 10
business 9.7
sky 9.6
room 9.6
men 9.4
adult 9.1
black 9
technology 8.2
worker 8.1
job 8
urban 7.9
laptop 7.7
construction 7.7
outdoor 7.6
truck 7.6
safety 7.4
seat 7.4
back 7.3
engineer 7.2
home 7.2
steel 7.1
day 7.1
architecture 7
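
Imagga's tagging endpoint returns a similar tag/confidence listing. The sketch below is one assumed way to request it over the v2 REST API; the credentials, file name, and exact response shape are assumptions, not taken from this record.

    import requests

    # Placeholder credentials and path; not part of this record.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"

    with open("untitled_boy_toy_car_1948.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )

    # Each entry pairs an English tag with a confidence score, as listed above.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))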

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.2
outdoor 93.5
vehicle 88.1
land vehicle 87.3
wheel 82.2
black and white 81.4
car 51.2

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 62.4%
Sad 77.3%
Calm 20.7%
Happy 0.5%
Angry 0.4%
Disgusted 0.4%
Fear 0.3%
Confused 0.3%
Surprised 0.2%
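
The age range, gender, and emotion percentages above correspond to Rekognition's face attributes. A minimal sketch of a call that could produce them, assuming a placeholder file name and region:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_boy_toy_car_1948.jpg", "rb") as f:
        faces = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')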

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
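
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client, assuming default application credentials and a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_boy_toy_car_1948.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood enums range from VERY_UNLIKELY to VERY_LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)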

Feature analysis

Amazon

Person 97.3%
Wheel 90.7%
Chair 80.7%

Captions

Microsoft

a person riding on the back of a truck 54.2%
a person sitting in front of a truck 54.1%
a person riding on the back of a truck 46.6%
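
The three candidate captions and their confidences are the kind of output the Azure Computer Vision describe operation returns. A hedged sketch against the v3.2 REST endpoint; the endpoint, key, and file name are placeholders, not part of this record.

    import requests

    # Placeholder endpoint, key, and path; not part of this record.
    AZURE_ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"
    AZURE_KEY = "your_subscription_key"

    with open("untitled_boy_toy_car_1948.jpg", "rb") as f:
        response = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/describe",
            params={"maxCandidates": 3},
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    # Candidate captions come back with confidences in the range 0-1.
    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')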

Text analysis

Amazon

CR
-
MJIF
8
13150
حية -
MJIF YE33AB ОСЛИА
CAST
YE33AB
ОСЛИА
حية
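
The fragments above are raw OCR detections of whatever lettering appears in the frame, reported as-is. A minimal sketch of the Rekognition text-detection call that could produce such output, assuming a placeholder file name and region:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_boy_toy_car_1948.jpg", "rb") as f:
        text = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections group the WORD detections; detected strings are not
    # cleaned or translated, so edge markings come through verbatim.
    for detection in text["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))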

Google

I
(0
Bill
31S
YT3RA2
Bill I 31S (0 MJIA YT3RA2 02MA
02MA
MJIA